The four stages of Multi-Source Feedback

Multi-source feedback (MSF) is not a complicated procedure, but it does involve a mix of moving parts, and it’s important to carry out the feedback process in a structured way to get the best result. The MSF process is set up in four stages to keep the exercise clear and organised for the candidate.

This article explains each of the four stages of multi-source feedback in detail.

Stage 1 – Objective and meaningful data collection

Step 1: The candidate selects their colleague reviewers and advises their feedback provider. The provider then guides the MSF candidate, if required, when they’re identifying their reviewers.

The candidate is generally in the best position to identify people who know them and have observed or experienced their practice. Given the formative nature of the MSF tool, the data are more useful if candidates identify reviewers who can offer a variety of perspectives and accurately assess their performance.

Step 2: The next step is for the candidate to nominate a supporting medical colleague (SMC). It’s recommended that the candidate selects an SMC who is familiar with their work but not directly responsible for managing them or evaluating their performance. The SMC role is informal but provides essential peer support throughout the MSF process.

Step 3: A patient feedback pack is issued to the candidate’s practice or administration support team, which administers it to a sample of the candidate’s patients. QR codes and short URLs are sent via email, and, if selected, a paper pack is posted to the address the candidate provides.

Step 4: The MSF candidate is guided through the process of identifying their reviewers. All identified reviewers must know the candidate and their clinical work well enough to give constructive and considerate feedback in a timely manner.

If digital collection is used, patient feedback data is more secure and more quickly processed, and as such, it’s the recommended approach for all MSF candidates.

Step 5: The candidate completes the Self-Assessment survey.

Step 6: The reviewers complete either the patient or colleague feedback component, which collects objective data about the MSF candidate’s observable workplace interpersonal skills and professional behaviours.

Undertaking the three parts that make up multi-source feedback

Multi-source feedback consists of three parts (or instruments) that work together to gather subjective and objective data. The three data collection methods are:

  • Patient Feedback
  • Colleague Feedback, and
  • Self-Assessment.

The following information provides details about the activities undertaken for each of the three feedback methods.

Patient Feedback Questionnaire

  • To ensure patient confidentiality, a nominated administrator will distribute the surveys to a minimum of 40 active patients and invite feedback.
  • The survey is easy to administer. It’s available in online and paper formats.
  • The online version is generally accessible via a unique URL or QR code, and is directly submitted via the secure portal.
  • The single-sided paper questionnaire is designed for the patient to complete and return via sealed envelope.
  • Both methods ensure patient anonymity and may be completed in about three minutes.
  • For report validity, a minimum of 30 completed, valid questionnaires is required. The responses are subsequently provided to and analysed by the survey provider to inform the development of the patient feedback results report.
  • Note: Ethics approval is not required for the survey since data collection is quality improvement-focused rather than research-focused.34

Colleague Feedback Evaluation

  • The candidate provides the names and email addresses of 15 nominated colleague reviewers (typically five doctors, five other clinical colleagues and five managerial or administrative staff).
  • Each reviewer is invited to complete the 10-minute online survey.
  • A minimum of 12 responses is required for report validity.
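As an illustration only, the response-validity thresholds described above (30 of roughly 40 patient questionnaires, 12 of 15 colleague surveys) could be checked with a short script. This is a hypothetical sketch, not part of any MSF tool; the function and dictionary names are the author's own assumptions.

```python
# Hypothetical sketch of the validity thresholds described above.
# These minimums come from the guide's text; nothing here is an official tool.
MIN_RESPONSES = {
    "patient": 30,    # of ~40 questionnaires distributed
    "colleague": 12,  # of 15 nominated colleague reviewers
}


def report_is_valid(responses: dict[str, int]) -> bool:
    """Return True if every instrument meets its minimum response count."""
    return all(
        responses.get(source, 0) >= minimum
        for source, minimum in MIN_RESPONSES.items()
    )


print(report_is_valid({"patient": 34, "colleague": 13}))  # True
print(report_is_valid({"patient": 28, "colleague": 13}))  # False
```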


Self-Assessment

The MSF candidate completes an individual 10-minute online Self-Assessment, which is used to compare their perceptions of their own performance to their colleagues’ perceptions.

Stage 2 – Analysis and reporting

A survey provider like CFEP Surveys receives a candidate’s data (at the response rate required for data validity) and analyses it by source (i.e. patients, clinical colleagues and non-clinical co-workers).

Analysis and reporting retain anonymity. Free-text feedback is provided verbatim minus any personal identifiers.

Before completing MSF, it’s worth confirming:

  • Who will have access to the report?
  • Who will see the results?
  • Where will the report be stored?
  • Who will have access to it in the future?

If the candidate is completing a full MSF, the candidate or participating provider organisation, supervisor or clinical educator will receive a comprehensive MSF report, with the feedback from all three instruments as well as comparator data where the items on surveys are identical.

Example for General Practitioners

All Practice Experience Program (PEP) reports are sent to the candidate’s college only, which then uploads them to the candidate’s portal.

  • For registrars, reports will typically be sent to the candidate and the college via separate emails.
  • For fellows (ACRRM, RACGP), a report will be sent to the candidate only. (Note: A copy won’t be supplied to the college.)

Example for Specialists

For registrars, reports will typically be sent to the candidate and the college via separate emails.

  • For participating fellows, a report will be sent to them directly.
  • For participating international medical graduates (IMGs), a report will be sent to the relevant college for distribution.

Where the candidate nominates a supporting medical colleague, it is the candidate who provides them a copy of the confidential MSF report.

Multi-Source Feedback Report

The MSF report gives a more rounded picture of the candidate’s performance, encompassing feedback from all three sources.

When the candidate is completing individual components of the MSF assessment, they will receive a copy (as appropriate) of the:

  • Patient feedback report – containing analysis of patient feedback, including benchmarking against their peers nationally, or
  • Colleague feedback report – containing the analysis of feedback from clinical colleagues and non-clinical co-workers, together with Self-Assessment comparators. Evaluation scores will be benchmarked against those of other participating clinicians.

Each candidate receives an introduction to the report. This outlines how to read the report and interpret the data (providing sample copies of both the patient and colleague survey instruments, and explaining data sources, analysis and benchmarking), and orients the reader (MSF candidate or debriefer) to data-informed reflection.

Generally, reports contain, as appropriate:

  • Tabulated and graphical data showing how well the candidate scored on each item
  • Frequencies and average ratings for each item
  • Comparator data for self-ratings and the average colleague feedback rating
  • Free-text comments listed by question and source, minus any personal identifiers.
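To make the tabulation concrete, the frequencies and average ratings mentioned above could be derived from raw responses along the following lines. This is a minimal sketch with made-up data; the item names and rating scale shown are illustrative assumptions, not the actual survey instrument.

```python
from collections import Counter
from statistics import mean

# Illustrative only: raw ratings per survey item on an assumed
# 1 (poor) to 5 (excellent) scale.
ratings = {
    "Ability to listen": [5, 4, 5, 5, 3],
    "Respect shown": [4, 4, 5, 5, 5],
}

for item, scores in ratings.items():
    freq = Counter(scores)  # frequency of each rating value
    avg = mean(scores)      # average rating for the item
    print(f"{item}: avg={avg:.2f}, frequencies={dict(sorted(freq.items()))}")
```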

The graphical overview of results allows the candidate to decide whether they see themself as others see them.

It allows the candidate and debriefer to compare Self-Assessment scores with patient feedback on interpersonal skills and colleague perceptions of professionalism.

Benchmark data gives a meaningful national average for candidates to consider where they sit along the continuum. They get a sense of how they’re performing relative to other candidates who have completed MSF.
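The idea of locating a candidate's score within a benchmark dataset can be sketched with a short, illustrative calculation. The function and the benchmark figures below are hypothetical assumptions for demonstration only, not how any survey provider actually computes its benchmarks.

```python
# Hypothetical sketch: where a candidate's overall percentage score sits
# relative to a benchmark dataset of other candidates' scores.

def percentile_rank(score: float, benchmark: list[float]) -> float:
    """Percentage of benchmark scores at or below the candidate's score."""
    at_or_below = sum(1 for s in benchmark if s <= score)
    return 100.0 * at_or_below / len(benchmark)


# Made-up benchmark dataset of other candidates' scores.
benchmark_scores = [70.0, 75.0, 80.0, 85.0, 90.0]

print(percentile_rank(84.0, benchmark_scores))  # 60.0
```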

Patient Feedback Report

The patient feedback report gives the results of the patient feedback questionnaires. It addresses the question: What is my patients’ experience of care? Or, put another way: How do my patients rate my interpersonal skills and the experience of care I provide?

The report outlines the distribution and frequency of ratings, on a poor-to-excellent scale, for the following items:

  • Satisfaction with visit
  • Warmth of greeting
  • Ability to listen
  • Explanations
  • Reassurance
  • Confidence in ability
  • Express concerns
  • Respect shown
  • Time for visit
  • Consideration
  • Concern for patients, and
  • Recommendation.

The patient feedback report provides percentage scores and benchmarks against the above categories, supported by a more comprehensive analysis. The analysis outlines patient demographics and associated mean percentage scores, including:

  • Age
  • Gender, and
  • How many times the patient has seen the doctor.
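The association of mean percentage scores with demographics can be pictured with a minimal grouping sketch. The field names and figures here are made up for illustration; the real analysis is performed by the survey provider.

```python
from collections import defaultdict
from statistics import mean

# Made-up responses: each carries a demographic group and a percentage score.
responses = [
    {"age_group": "18-39", "score_pct": 82.0},
    {"age_group": "18-39", "score_pct": 90.0},
    {"age_group": "40-64", "score_pct": 76.0},
    {"age_group": "40-64", "score_pct": 84.0},
]

# Group scores by demographic, then average each group.
by_group = defaultdict(list)
for r in responses:
    by_group[r["age_group"]].append(r["score_pct"])

mean_pct = {group: mean(scores) for group, scores in by_group.items()}
print(mean_pct)  # {'18-39': 86.0, '40-64': 80.0}
```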

Colleague Feedback Report

The colleague feedback report provides the results of the colleague surveys, which assess the MSF candidate’s professionalism. It addresses the question: How am I perceived by my colleagues?

Again, this report details the distribution and frequency of ratings, mean percentage scores and benchmarks, highlighting where the MSF candidate sits in relation to other scores within the relevant benchmark dataset.

Similarly, it gives deidentified qualitative feedback to complement patient feedback and Self-Assessment. These results are designed to offer constructive feedback.

Some examples of qualitative colleague feedback include:

  • A valued member of the medical centre, very respected by her colleagues, patients and staff members.
  • The doctor is organised, respectful of her patients and other team members.
  • This doctor could improve in time, although this may be limited depending on the patient and their reason for concern.
  • Closure of wounds could be looser when tightening to assist with skin integrity and removal of sutures.

In addition, patient comments are collated, and qualitative data is presented for consideration.

Some examples of the patient comments include:

  • Very good doctor
  • The is nothing I could suggest. This doctor is an awesome doctor. She make you feel very comfortable and is care and concerned for me at every visit.
  • Move to a new area was made easy with this doctor, always there when needed and always happy to fir my kids in when they’re sick.
  • I genuinely commit to my overall experience with this doctor. I consider her my forever doctor.

Self-Assessment report

The candidate completes the self-assessment instrument alongside colleague feedback and it’s included in the report for comparison. The self-assessment addresses the question: Do I see myself as others see me? It also captures personal reflections about strengths and opportunities for improvement.

The report presents the colleague feedback data side by side with the data from the self-assessment. This gives the participant a clear comparison of how they see themselves and their performance against how their colleagues view them and their performance.

Performance reflection

In many MSF reports, a synthesis of results is presented in performance reflection tables. These highlight potential areas for personal development, growth and improvement, with more detailed information given in the body of the patient feedback or colleague feedback reports.

Stage 3 – Debrief and self-reflection

Research in medical education demonstrates that simply receiving a report on one’s performance is often insufficient to promote learning or a change in practice, even when gaps are readily apparent.38,97

While initial work in the development of MSF did not include facilitated debrief and self-reflection, the opportunity for candidates to discuss feedback is seen as critical to good outcomes from MSF.

As such, facilitated debrief and self-reflection, ideally performed by a trained coach, is the third stage of the MSF process and the precursor to developing a targeted improvement plan in stage four.4,11,15,34,43,96,97,100-102

The factors that influence behaviour change to drive improvement include:21,46,103,104

  • Organisational culture
  • Perceptions of credibility of the data
  • Self-perceptions of one’s own performance
  • The candidate’s belief in their ability to change
  • Peer support
  • Context and improvement
  • System constraints, and
  • Emotional reactions to the data.

Making a personal, professional or practice change in response to performance data is complex and benefits from three specific interventions:

  • having a facilitated feedback conversation about the MSF report and data
  • adopting a coaching approach when considering the need for change
  • co-developing an annual action plan for professional development, growth and improvement.

An important and compulsory component of the MSF process is the candidate’s ability to meet with a trusted person to review the data and discuss the results. In recent implementation trials, 89 per cent of Royal Australasian College of Physicians candidates agreed or strongly agreed that debrief was a valued component of the process.105

‘[The debrief] was without doubt the outstanding part of the process. The critical reflection I achieved in dialogue went vastly beyond the scope of what I could achieve looking at the report on my own and really crystallised some important but unrecognised professional issues for me.’

RACP MSF trial participant, 2017

Debrief and self-reflection goals:

  • inform the co-development of an annual action plan for continuing professional development
  • normalise the process of professional review and reflection as part of a supportive whole-system learning culture encompassing professional bodies and provider organisations
  • (and, for clinicians who are excelling) focus on an aspirational change or improvement they may wish to make.

Reflective learning is an essential component of professional practice. It involves considering the results and thinking about the experiences of patients and colleagues retrospectively in order to learn from them. The debrief helps the candidate to ‘unpack’ their MSF report; where possible this should occur as soon as they receive the report.

But debrief and facilitated reflection can be challenging, even when results are positive. It requires leadership, adopting a coaching approach, and effective interpersonal skills based on trust and respect. Research has identified characteristics of the feedback conversation that enhance its effectiveness.106-109

Characteristics of effective feedback conversations 106-109

  • Having a supportive and respectful relationship and environment.
  • Using open-ended questions to stimulate reflection and guide self-assessment and self-critique.
  • Exploring patient and colleague perspectives and the candidate’s reactions to the data, feedback and results.
  • Ensuring the MSF candidate understands what the data, feedback and results mean.
  • Identifying strengths and opportunities for personal development, growth and improvement.

Facilitated debrief and reflection help the individual candidate to:5,21,106-110

  • be more aware of the experience of care they provide
  • explore reactions to the feedback and better understand what it means to them
  • translate new information into insights and knowledge about strengths
  • identify and implement areas and priorities for personal development, growth and practice improvement.15,34,101,102

When conducting facilitated debrief and fostering self-reflection, the facilitator or coach needs to pay attention to the skills required for handling both process and content:106,110

  • Process skills include:
    • reviewing the purpose of the program and session goals with the clinician
    • developing the relationship throughout the session
    • ensuring familiarity with the data
    • using communication micro skills to explore reactions to the results, clarify understanding and provide encouragement through active listening and open questioning
    • promoting reflection and Self-Assessment by bringing blind spots into focus
    • being flexible about the content to be discussed.
  • Content skills include:
    • collaborating to make sure the clinician is engaged in and committed to the discussion
    • goal setting and developing anticipated outcomes
    • creating a tailored action plan and a follow-up plan to monitor progress and ensure accountability.

The facilitated conversation should focus on the data in each report as well as triangulated data across colleagues (clinical colleagues and non-clinical co-workers) and Self-Assessment reports, or benchmarks with other participating clinicians.

For example, clinical colleagues may provide high ratings for punctuality and reliability but non-clinical co-workers might rate those items lower. The discrepancy may provide an opportunity to ask the clinician about the difference.

Considering the candidate’s Self-Assessment report creates space for reflective discussion, especially when the Self-Assessment scores differ from their reviewers’ scores.44 A difference in results creates an opportunity for further exploration.

For clinicians who are excelling, this discussion could focus on an aspirational change or improvement they may wish to make personally, professionally or organisationally.

The diagram below illustrates the process adopted in this stage.

Note: The debrief happens before the reflective period and the personalised action plan are completed.

Clinicians have two options for supported debrief and self-reflection: formal debrief and informal debrief.

Formal debriefing

The formal debrief approach adopts the Relationship, Reaction, Content, Coaching (R2C2) feedback model,21,46,100 a strength-based approach to facilitation, coaching and action planning.

This R2C2 model is founded on three theoretical perspectives – humanism, informed Self-Assessment and the science of behaviour change – and includes four phases: relationship, reaction, content and coaching.21,46

Theory and research inform each phase, to guide the feedback conversation and provide open questions to promote self-reflection, self-critique and self-direction.

The intention is that the facilitator or coach use the model iteratively, to explore the sections of the report that are most meaningful to the candidate and to the facilitator or coach, and to coach the individual through the process of co-developing a purposeful action plan.

Facilitated discussion is focused on the data in the MSF report about each of the three MSF roles: communicator, collaborator, professional. In upholding the formative nature of MSF, the facilitator or coach uses a coaching approach where discussions focus on the clinician’s relative strengths and improvement areas within each role.

Coaching in the MSF context is considered: ‘a one-to-one conversation focused on the enhancement of learning and development through increasing self-awareness and a sense of personal responsibility, where the coach facilitates the self-directed learning of the coachee through questioning, active listening, and appropriate challenge in a supportive and encouraging climate’.111

Some candidates might find considering results challenging. It’s important that a skilled facilitator or coach conducts the debrief and understands, plans and prepares for challenges, such as:

  • how to manage the session when the clinician hasn’t reviewed their data
  • how to manage the conversation when the clinician is resistant to change
  • how to work with a clinician who is clearly upset by the feedback, or alternatively considers themself an overachiever.

Facilitated coaching conversations may include open-ended questions, such as:99,101

  • Did you focus on particular sections of the feedback report? If so, please describe which sections you focused on and why.
  • What did you learn that was expected or unsurprising? Why was it expected?
  • What did you learn that was unexpected or surprising? Why was it surprising?
  • What did you find that seemed noteworthy or important? Why was it important?
  • Is there a gap between the care you want to offer and what the report suggests?

Completing the MSF tool and this reflection process can contribute to CPD requirements for the candidate and their supporting medical colleague.

For more information about CFEP Surveys’ formal facilitated debrief services, and to access a range of complimentary debriefing training tools, speak to the CFEP Surveys team.

Informal debriefing

This debrief conversation is typically conducted between the MSF candidate and their supporting medical colleague – a trusted medical colleague the candidate appoints to support them in this process.

Informal debrief allows opportunity to consider and discuss the report, reflect on results, and establish insights to inform subsequent action planning. In some cases, clinicians on training pathways receive debrief support from their clinical supervisor or medical educator.

Alternatively, a medical division or departmental head of service may provide debrief when a participating organisation sponsors the MSF program.

The format for informal debrief is not generally specified by the survey provider and the training provider or sponsoring organisation uses it at their discretion. CFEP Surveys recommends adopting a skilled coaching approach to focus on the clinician’s priorities for change, goals for improvement, and co-creating an action plan based on the performance data.48

A range of complimentary training resources covering MSF, and in particular debriefing and the R2C2 model, is available to support your MSF journey. These include video demonstrations of how to deliver a debrief for candidates who have received either excellent or challenging peer reports, plus a series of micro-learning videos covering common debriefing scenarios, ‘pearls and tips’, the R2C2 model, and more. These can be accessed from our Multi-Source Feedback Resources page.

When conducting an informal debrief, we recommend the MSF candidate and their supporting medical colleague establish a shared understanding, outlining those principles and agreements they will uphold during the MSF process; for example:

  • Confidentiality – all data and reports will remain confidential, and discussion during the informal debrief will be conducted in strictest confidence to allow openness and honesty.
  • Respect – respectful consideration will be given to feedback from patients and colleagues, recognising their unique position, the spirit in which the feedback was provided, and a focus on learning to inform professional development, growth and improvement.


The MSF program aims to normalise the process of professional review and reflection as part of a supportive whole-system learning approach encompassing professional bodies, provider organisations and health care professionals.

The feedback report will also include guidance on conducting a reflective exercise to inform the development of a personalised action plan, plus tools and templates for considering results by source (i.e. patient and colleague) and results overall.

The self-reflection exercise is best completed immediately following the debrief, then continued and adapted by the candidate throughout the four-to-six-week self-reflection period, during which the candidate trials new ways of working based on the feedback received and the strategies discussed during the debrief.

The supporting medical colleague or medical educator may participate in this activity, informally reflecting on the candidate’s results and providing additional insight and support to them.

Completing self-reflection helps the candidate to prioritise areas for personal development, growth and improvement and develop the action plan. Once the candidate submits the completed reflective exercise to either the participating medical college or their survey provider, they can receive CPD recognition.

Stage 4 – Action planning and CPD allocation

The MSF process culminates in the co-development of goals for personal development, growth and improvement, and an evidence-informed action plan.4–7

Co-development is critical to the action plan’s success. The candidate must make a major contribution and feel they own their personalised action plan, while the coach contributes their experience and knowledge and facilitates the process and content development.

The role of the formal debrief coach is not to provide answers or solutions, but to help the candidate prioritise areas for improvement, set goals, identify opportunities and strategies for professional development, growth and improvement, and document agreed changes in an annual action plan.

Research suggests action plans and anticipated outcomes are more likely to be achieved when co-developed, as opposed to being developed by only the candidate, facilitator or coach, or medical director.28,54,58,59

During the action planning process, the facilitator or coach may also connect the MSF candidate with learning resources and system supports to help them achieve their improvement goals.

Informed by MSF results and insights from the facilitated debrief session, the action plan describes the changes the candidate intends to make in the short term (over the next six to eight weeks) and across the medium term (over the next six to 12 months). It also captures the initial activity required for longer-term goals (12-plus months).

A multi-source feedback action plan template outlines a structure in which the candidate can capture information about:

  • resources they need to support the changes
  • the enablers and barriers to change
  • what success looks like, and
  • how and when they will know they have achieved success.

We highly recommend the candidate and the supporting medical colleague hold a follow-up facilitated coaching session one to two months following the development of the action plan to reflect on the impact of the changes made in the short term.

The true value of the MSF process is realised as candidates embed learnings into improved patient experience of care, clinical practice, and collegiate relationships.
