SQE Independent Reviewer Annual Report 2022/23
10 April 2024
Geoff Coombe, SQE Independent Reviewer
Purpose
In January 2019, the SRA created the role of Independent Reviewer of the SQE. The purpose is to provide external assurance to the SRA and its exam services provider, Kaplan, that the SQE is fair, defensible and can command public confidence.
I have previously reported on the pilot stages during the design phase of the SQE and the first sitting of the SQE1 and SQE2 exams. Kaplan now holds multiple sittings of SQE1 and SQE2 each year, and this is my second annual report on those sittings. Its purpose is to provide a high-level overview of the observed performance of the examination processes and outcomes between October 2022 and September 2023.
As in previous reports, recommendations for improvements or enhancements are made, as well as encouragement to continue good practice where it has been observed. In making recommendations, I am guided by my judgement as to what is in the best interests of candidates taking the exams and those who rely on the outcomes of the exams to make informed decisions about candidates' capabilities.
Executive summary
The SQE assessments are high-stakes, complex to deliver and technically challenging to develop and sustain. Given this challenge, it is to be expected that there would be some operational delivery issues as this new assessment beds in.
Overall, the 2022/23 delivery demonstrated year-on-year improvement and was generally good. However, a small number of service failures did occur, mainly at Pearson VUE test centres, but these were of a lesser scale than those experienced in 2021/22.
There was also an isolated error in the quality of marking for one of the 16 assessment stations in the April 2023 sitting of SQE2. While effectively resolved and of limited impact, it requires planned changes to reduce the likelihood of a recurrence.
During 2021/22, Kaplan laid the foundations for successful delivery of the SQE to the high standards that candidates, stakeholders and the public expect. Many planned improvements were delivered by Kaplan and/or the SRA in 2022/23. There are further opportunities for improvement, as would be expected in just the second full year of delivery. The majority of these inevitably focus on exam creation and delivery processes.
Leaders of the SRA and Kaplan should maintain positive joint working across teams and continue to foster an ongoing culture of openness, especially when inevitable delivery issues arise. In the meantime, candidates, stakeholders and the public should have confidence that the SQE outcomes delivered in 2022/23 were fair and reliable.
Evidence was gathered through a mixture of:
- direct observation of a wide range of exam creation and delivery activities
- interviews with key staff at the SRA and Kaplan, including senior assessors
- access to management reports and information produced by the SRA and Kaplan
- support and advice from the Independent Psychometrician.
In order to provide an annual overview, this report is broken down into the key activities which enable the delivery of the SQE exams:
- Exam creation and production
- Exam delivery and assessment
- Candidate services, reasonable adjustments, mitigating circumstances and appeals
- SQE in Welsh
- Standard setting, determining the pass mark and issuing results
- Quality assurance.
The key processes for successful exam creation and production continue to be in place. I have defined 'successful' to mean that the assessments are high quality, can be marked reliably and are valid. A crucial aspect of validity is that the assessments require appropriate functioning legal knowledge (FLK) and simulate what a day one solicitor is likely to need to know and do. The SQE assessments are complex and technically demanding to create and deliver, especially given their high-stakes nature and context.
During 2022/23, Kaplan expanded its academic team by appointing additional qualified solicitors. I have met a number of these new appointees while observing exam activities. I have found staff to be knowledgeable and committed to ensuring assessments are as valid and reliable as possible.
Two years ago, the SRA appointed and trained subject matter experts (SMEs), all of whom are also qualified solicitors. They have developed their working practices and aim to offer positive and constructive feedback to the academic teams at Kaplan. There remains further opportunity for Kaplan and the SRA to reflect on how feedback from the SMEs is most effectively received and used. This reflection should include when feedback should be received for it to be of most benefit to the process.
The interaction between the SMEs and Kaplan's academic staff should be reviewed to explore opportunities for:
- a better common understanding of all aspects of the exam creation process, where separate or joint professional development could be beneficial
- a common understanding of the rationale for final decisions made, for example, when finalising assessments prior to the exam.
For both SQE1 and SQE2 exams, the SRA and Kaplan have in place appropriate processes to create good assessments. These include maintaining the safeguards reported in my first annual report last year.
It remains important that the post-exam psychometric reviews are as thorough and forensic as possible. This is to secure future improvements to the exam creation process and to deliver assessments that are demonstrably world-class in the high-stakes context within which they operate.
Overall, the exam creation and production processes were effective and this is evidenced by:
- The psychometric data analyses.
- The effective pass/fail standard setting process that was conducted.
- Candidates' feedback that generally the assessments were fair.
Progress has been made towards wider and more diverse representation among SQE assessment writers; however, there is room for further progress.
The SQE comprises two parts:
- SQE1 requires candidates to sit two assessments, each comprising 180 one-mark questions assessing FLK, which require the candidate to select the single best answer from five possible answers.
- SQE2 requires four oral assessments, taken at a small number of locations across England and Wales, plus twelve written assessments taken at Pearson VUE test centres worldwide.
During the period covered by this report, there were two sittings of the SQE1 and three sittings of the SQE2 exams.
SQE1 and SQE2 written exams took place in 51 countries, as well as at many assessment venues in England and Wales. For the vast majority of candidates, the exam delivery worked effectively. A small minority of candidates experienced issues when using a Pearson VUE test centre.
SQE1 candidate responses are computer marked, providing a robust, very effective and highly reliable method of assessing the FLK. SQE2 exams are marked by qualified solicitors and trained actors acting as assessors, who must be trained to a common standard of marking and use their professional judgement when applying the marking criteria.
With the exception of SQE2 oral exams and a small number of access arrangements for candidates with a particular requirement, all exams are delivered on-screen via Pearson VUE test centres across the UK and worldwide. This process worked more effectively in 2022/23; however, there were, as is almost inevitable, a number of on-the-day delivery issues. The vast majority of these were resolved so that candidates could proceed with their scheduled examination, albeit occasionally with some delay.
The notable exception was in July 2023, when 53 candidates were unable to sit the SQE1 exams at one venue on the originally scheduled date. In response, alternative dates were offered soon after the initial scheduled exam date. Most candidates accepted, successfully sat the exam and were provided with their results at the same time as all other unaffected candidates. A small minority decided to take the exam at the next sitting.
Following a more serious incident at the Hammersmith test centre in 2021/22, Kaplan and Pearson VUE implemented a series of changes to improve preparation and on-the-day communications. This included a more robust review process after each exam sitting using RAID (risks, actions, issues and decisions) logs to prioritise future improvement.
While these changes did lead to an improved candidate experience, further improvements can be made. I observed a more thorough approach to capturing issues arising on exam days, and Pearson VUE showed a commitment to respond to learnings from each exam series in 2022/23.
In my last annual report, I recommended that a spell-checking function should be made available to candidates when completing SQE2 written exams. Kaplan has stressed to Pearson VUE the urgency of this functional improvement to the on-screen exam delivery system.
As this system is used by a wide range of Pearson VUE clients, with varying and differing requirements, it is understood that changes can take some time to implement, not least because they need to be thoroughly tested before being made available in the live context. At the time of writing this report, I understand that Kaplan was awaiting an update from Pearson VUE. It is important that Kaplan maintains its efforts to prioritise this functional improvement, and I am hopeful that progress may be made in the next twelve months.
In the absence of a spell checker function, markers of the SQE2 written exams have been provided with additional guidance. This advice was provided to reduce the risk of giving candidates the benefit of the doubt when faced with spelling, grammatical and other typographical errors in candidate responses and to promote greater consistency of approach.
Without a spell-checking function, the exams do not accurately replicate the context within which a day one solicitor would operate and require markers to make difficult judgements. Provision of this function would reduce the risk of crediting candidates who cannot communicate at the appropriate competency level.
SQE2 oral exams were delivered at test centres located in Cardiff, Manchester and two sites in London. These sites are fully managed by Kaplan. SQE2 oral assessments are logistically complex, requiring the assessor and candidate to be face-to-face in an appropriately secure and confidential space, while candidates are quarantined for that assessment task/day.
As was the case in 2021/22, I observed these assessments were generally well conducted and expertly delivered. I visited venues in Manchester and at Euston in London during the course of the year. I also received feedback from the SRA's staff and SMEs after their observation visits to other locations.
During 2022/23, I observed several standardisation and calibration meetings for SQE2 written and oral assessment assessors and markers. Assessors are generally well prepared for the difficult task of ensuring marking is reliable, ie that a candidate would receive a very similar assessment outcome regardless of the assessor allocated to review their work.
During this time, Kaplan implemented and trialled a number of changes to enhance the checks for reliable marking. These included:
- further exemplification of the Threshold Standard
- new and additional processes to check the quality of marking both before SQE2 written markers start marking and during the marking period, which included the use of 'seeded' or common scripts and further statistical analyses (a simple illustration of such a check follows this list).
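By way of illustration only, the sketch below shows one simple form a seeded-script consistency check could take: each marker's marks on the common scripts are compared with the reference marks agreed at calibration, and markers whose average deviation exceeds a tolerance are flagged for follow-up. The data shapes, names and tolerance are my own assumptions, not Kaplan's actual process.

```python
# Illustrative sketch only: one possible seeded-script consistency check.
# The data layout and tolerance are assumed for illustration and do not
# represent Kaplan's actual systems.
from statistics import mean

TOLERANCE = 1.0  # hypothetical maximum acceptable mean deviation (marks)

def flag_drifting_markers(reference_marks, marker_marks):
    """Flag markers whose marks on seeded (common) scripts deviate, on
    average, from the calibrated reference marks by more than TOLERANCE.

    reference_marks: {script_id: reference_total}
    marker_marks:    {marker_id: {script_id: awarded_total}}
    """
    flagged = []
    for marker_id, marks in marker_marks.items():
        deviations = [marks[sid] - ref
                      for sid, ref in reference_marks.items()
                      if sid in marks]
        if deviations and abs(mean(deviations)) > TOLERANCE:
            flagged.append((marker_id, round(mean(deviations), 2)))
    return flagged

# Example: marker "M2" marks consistently above the reference standard.
reference = {"S1": 10, "S2": 12, "S3": 8}
marking = {
    "M1": {"S1": 10, "S2": 11, "S3": 8},   # within tolerance
    "M2": {"S1": 12, "S2": 14, "S3": 10},  # +2.0 on average: flagged
}
print(flag_drifting_markers(reference, marking))  # [('M2', 2.0)]
```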
In making these changes, Kaplan is focused on the end goal of assuring itself that all markers are marking to the same standard and demonstrating consistency in the application of the marking criteria throughout the marking period.
In bringing about these changes in 2022/23, I have observed continuous learning and improvement. As Kaplan's academic team, responsible for setting assessments and overseeing marking, contains a significant number of staff who are still relatively new in post, there is a need to continue investing in their professional development.
As is planned, it is important that Kaplan has training and support in place to continue to develop staff knowledge and understanding of the technical and theoretical aspects of assessment design and delivery, using in-house support from psychometricians and relevant external academic research literature and/or assessment experts. This was a recommendation I also made in 2021/22, and further progress is needed in 2023/24.
In September 2023, shortly after the results of the SQE2 April 2023 sitting were released, a small number of candidates raised enquiries about the detailed pattern of results they received. The enquiries noted a number of zero scores, across the individual marking criteria, for one of the 16 assessments.
An initial investigation by Kaplan established that there was an isolated error in the marking of one of the 16 assessment stations in the April 2023 SQE2: Business Case and Matter Analysis. In addition to remarking the scripts where there was evidence of error, Kaplan and the SRA decided that a formal review of the marking of the full set of 346 scripts in which the error was found was needed, to provide assurance that no other errors had occurred. I observed the discussions between the SRA and Kaplan about the appropriate action.
I am satisfied that thorough and timely action was taken to investigate the issue raised, including the use of suitably qualified experts to review the marking and the way the final outcomes of the review were implemented and approved. Following the review, one candidate's overall SQE2 grade changed from fail to pass. It is, of course, very unfortunate to find an error after results are issued and to require such a review of marking.
The primary concern should be, and was, for any candidate affected, where there was a change in mark. The SRA and Kaplan are very aware of the potential impact of any error in marking, however limited, on a candidate and the potential impact on their career.
Kaplan has conducted a thorough review to understand why the issue was not picked up earlier in the process, prior to issue of results. While there was sampling of the quality of marking, as planned, the samples selected did not pick up any significant errors in marking. However, there was a statistical report available to the Kaplan team which could have led them to further investigate the number of candidates being scored as zero for all criteria within the full set of 346 scripts.
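To illustrate the kind of monitoring check that could surface this pattern before results are issued, the minimal sketch below flags scripts scored zero on every marking criterion so that unusual concentrations can be investigated. The data shapes and names are hypothetical and do not represent Kaplan's actual reporting.

```python
# Illustrative sketch only: flag scripts where every marking criterion
# was scored zero, so unusual concentrations can be investigated before
# results are issued. Data shapes and names are hypothetical.
def flag_all_zero_scripts(scripts):
    """scripts: {script_id: {criterion: score}} for one assessment station.
    Returns the ids of scripts scored zero on every criterion."""
    return [sid for sid, criteria in scripts.items()
            if criteria and all(score == 0 for score in criteria.values())]

scripts = {
    "C001": {"analysis": 2, "application": 1, "structure": 2},
    "C002": {"analysis": 0, "application": 0, "structure": 0},  # flagged
    "C003": {"analysis": 1, "application": 0, "structure": 1},
}
flagged = flag_all_zero_scripts(scripts)
print(f"{len(flagged)} of {len(scripts)} scripts all-zero: {flagged}")
# An unexpectedly high count, relative to past sittings, would prompt
# a closer review of the marking for that station.
```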
I am satisfied that this lesson has been learned, and additional steps are planned for the review of SQE2 marking during and immediately after the marking period, and before results are issued. This will require Kaplan academic staff to receive additional training to better interpret the psychometric data available to them, which was a recommendation in my annual report last year. It is, of course, essential that both this training and additional quality assurance and monitoring steps are implemented before future SQE2 sittings.
It is important that Kaplan continues to invest in the growth and development of its academic team, especially those leading calibration and standardisation activities. This should include the continued development of technical assessment expertise, ensuring that best practice in other high-stakes professional exams leading to a licence to practise is understood and learned from, and using psychometric expertise and analysis of data relating to SQE exam outcomes to inform where improvements to the assessor standardisation, calibration and checking processes may be made.
During 2022/23, good progress has been made in developing psychometric analysis, with different ways of exploring potential bias, item analysis, and trends over time, and in reporting, as more data becomes available and different analyses become possible. This work has been very well supported by an Independent Psychometrician employed by the SRA.
Analyses around potential bias continue to be a focus and should remain so. There is room for further development of the review of assessment items that are 'flagged' for further investigation by these analyses: for example, ensuring that those carrying out the analyses have access to the very best information and training, and that those carrying out such reviews are themselves from diverse backgrounds. This is important when item analyses point to a differential in performance between demographic groups.
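As a purely illustrative sketch of the first step of such an analysis, the code below flags items whose facility (the proportion of candidates answering correctly) differs notably between two demographic groups. Operational differential item functioning (DIF) analyses are considerably more sophisticated, controlling for overall candidate ability; the threshold and data shapes here are assumptions of mine.

```python
# Illustrative sketch only: a first-pass screen that flags items whose
# facility (proportion answering correctly) differs notably between two
# demographic groups. Real DIF analyses control for overall ability;
# the threshold and data shapes here are assumptions.
def flag_items_by_group_gap(responses, group_a, group_b, threshold=0.10):
    """responses: {item_id: {candidate_id: 1 if correct else 0}}
    group_a, group_b: sets of candidate ids.
    Returns {item_id: facility_gap} for items exceeding the threshold."""
    flagged = {}
    for item_id, marks in responses.items():
        a = [m for cid, m in marks.items() if cid in group_a]
        b = [m for cid, m in marks.items() if cid in group_b]
        if a and b:
            gap = abs(sum(a) / len(a) - sum(b) / len(b))
            if gap > threshold:
                flagged[item_id] = round(gap, 2)
    return flagged

responses = {
    "Q1": {"c1": 1, "c2": 0, "c3": 1, "c4": 0},  # similar facility
    "Q2": {"c1": 1, "c2": 1, "c3": 0, "c4": 1},  # notable gap
}
print(flag_items_by_group_gap(responses, {"c1", "c2"}, {"c3", "c4"}))
# {'Q2': 0.5} — Q2 would be passed to reviewers for closer scrutiny.
```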
The psychometric examiner analyses and the post-exam examiner survey have also been developed further this year. Some of this work can now be based on cumulative data, which is possible after three or more sittings. These data provide further opportunities to learn from, refine and improve exam creation and delivery processes, and it is important that Kaplan uses these sources to finalise its training and development plans for academic staff in 2024.
Overall, candidate services processes have performed well, and the approach to obtaining candidate feedback remains excellent. However, some candidates continue to raise concerns about the booking process for exams. As with other processes and services, there was evidence of continuous improvement throughout 2022/23.
Kaplan has established good web services, and its candidate services team demonstrates a strong commitment to providing good service to candidates. Recent candidate surveys suggest a measurable improvement in their experience during 2022/23.
However, an ongoing concern raised by a minority of candidates has been the booking process. At the time of finalising this report, further concerns were raised about the booking process for the SQE1 exam due to be held in January 2024. This issue will be featured in the next annual report.
If and when things go wrong, the Kaplan team continues to work quickly and effectively to find appropriate solutions, treating each candidate as an individual and tailoring its approach to that individual.
Kaplan now has additional senior capacity in its candidate-facing teams, including the appointment of an Equality Manager and a Quality Manager. These appointments demonstrate a commitment to prioritising a consistent and high-quality candidate experience, as well as accommodating the ongoing increase in candidate numbers and, therefore, in the number of candidate enquiries raised.
Reasonable adjustments (RAs) are offered to candidates who request and need them. The most common adjustments continue to be:
- extra time
- sole use assessment room
- access to medicine/snacks/water during the assessment.
Additional and new bespoke provisions were also arranged where evidence supported this. There was evidence that the relevant team in Kaplan was careful to address each individual's needs, with effective interaction between the two parties to agree on the nature of any adjustment.
As in 2021/22, on a very small number of occasions, candidates reported the agreed adjustment plan was either not in place when they arrived at the test centre or was not satisfactory. This included one occasion where additional new functionality did not work as effectively as hoped, despite good planning.
Kaplan remains committed to finding additional solutions to meet individual candidate needs, and read/write assistive technology support was offered for the first time. It reviews and responds to candidate feedback and seeks support from specialist support organisations as appropriate. It is also working with Pearson VUE to prioritise changes to its systems to enable a wider range of RAs to be accommodated.
On occasion, separate, standalone solutions which did not use the Pearson VUE system were used in 2022/23. Ideally, such exceptions will be minimised over time as improvements are made. What is critical is that the adjustment in place is suitable for each candidate's needs without conferring any unfair advantage or disadvantage on that candidate.
Overall, the RAs process appears to have worked well; this includes careful monitoring of outcomes for the overall cohort compared to candidates with RAs. In 2024, it is planned for additional forms of the SQE1 exams to be made available. This will provide options for some RA candidates to have a bigger gap between their FLK1 and FLK2 exams, which will respond to some feedback requests.
If a candidate believes they have suffered some material disadvantage while taking an exam, they may present a mitigating circumstance claim. The majority submitting a claim cited 'a mistake or irregularity in the administration or conduct of the assessment'.
I once more observed a very thorough and painstaking approach taken when considering each claim, with each claim given very careful consideration before being accepted or rejected by the Assessment Board. Case histories for various scenarios that are accepted or rejected have been built up, and past decisions are now referenced to maintain consistency over time.
Candidates have three attempts to pass both SQE1 and SQE2. I remain concerned that, as candidates run out of opportunities, some will attempt to 'game' the system by submitting spurious mitigating circumstances requests.
To maintain the integrity of the process, it will need to withstand the risk of the volume of spurious requests delaying processing times while protecting the interests of candidates who raise legitimate requests. I am satisfied that Kaplan will continue to closely monitor and respond to this risk.
I have seen evidence of ongoing continuous improvement actions, including revising the claim form to provide more support and direction as to the information that needs to be supplied. This aims to streamline the process and reduce the occasions on which the Kaplan team needs to raise further queries, post claim submission.
Should a candidate wish to, they may appeal the outcome of their assessment on grounds of either:
- there are mitigating circumstances which could not have been put before the Assessment Board before it made its decision or
- the decision reached by the Assessment Board or the manner in which that decision was reached involved material irregularity and/or was manifestly unreasonable and/or irrational.
There were 26 First Stage Appeals (for SQE2 October 2022 and SQE1 January 2023). Two were upheld. From the evidence available, I believe the process and policy were appropriately followed and cases were given full consideration.
After each assessment sitting, Kaplan issues an online survey for candidates to provide feedback on their experience. This survey and the relevant follow-up activities provide valuable feedback, and Kaplan has shown some improvements through 2022/23, with candidate satisfaction improving across a range of measures.
Overall, Kaplan does an excellent job of gathering feedback from candidates after each sitting. It has demonstrated a commitment to prioritise and implement improvements that provide most benefit to the candidate service overall.
The SQE will be offered in the medium of Welsh in a phased implementation. This started in 2022 with the option for candidates to provide SQE2 oral and written assessment responses in Welsh. There was no take-up of this option in 2022/23.
Candidates will be able to access SQE2 questions which have been translated into Welsh from October 2023 onwards.
SQE1 is planned to be offered in Welsh for the first time in January 2025. As part of Kaplan's preparation for this, an SQE1 Welsh pilot took place on 28 June 2023 across six Pearson VUE centres, including five centres located across Wales.
The main aim of the pilot was to test specific aspects of conducting an SQE1 assessment in the Welsh language. The participant feedback was the most important output from the pilot. Participants were not asked to prepare for the assessment in the same way as when sitting an actual SQE1 assessment. For this reason, and also given the relatively small group of candidates, there was no standard setting and candidates were not provided with results.
The pilot used 90 SQE1 sample questions. These were translated by a professional Welsh translation company and then reviewed by a panel of Welsh-speaking solicitors. Meetings were held between Kaplan SQE Subject Heads and the Welsh-speaking solicitors in order to refine the translation of the questions.
Because certain terms were translated into Welsh, and some of the same words can carry different meanings, it was important to deal with this in the presentation of the questions. They were presented in two ways:
- Some questions had a particular term(s) underlined and were accompanied by an on-screen assessment-specific glossary; and
- Some of the questions had a particular term(s) underlined and were accompanied by the English translation in brackets within the question.
A total of 28 candidates sat the pilot. One additional candidate was unable to test at the Swansea centre and so reviewed the questions separately. All 29 candidates completed the feedback survey which found that:
- The pre and post assessment communication and organisation were good.
- There was a clear preference as to how the questions were presented (option 2 above).
- There was a mixed response in relation to the quality of the translation of the questions, with some feeling the translation was too formal and that certain terms were difficult to understand.
- 11 candidates said they would choose to sit SQE1 in Welsh rather than English.
In September 2021, an SQE2 pilot was held to test the operational aspects of conducting SQE2 in Welsh. This involved SQE2 written and oral legal skills assessments. A small number of Welsh-speaking solicitors were recruited for the project and these individuals have now joined the pool of SQE2 examiners as assessors and/or markers. Following the pilot, the SQE2 in Welsh was phased in.
The first opportunity that candidates had to provide answers in Welsh for all SQE2 stations (both oral and written) was the October 2023 sitting. The assessment questions for the oral stations were available in Welsh for the first sitting of the October SQE2 only.
The assessment questions for the written stations were also translated from English into Welsh, as well as all exam-related materials and operational information (eg candidate instructions and guidance on using the Pearson VUE platform).
Not all candidate-facing documents were translated from English into Welsh. Materials that were subject to copyright, or where no Welsh version of official documents existed (eg UK primary or secondary legislation), were not translated. The approach reflects real life as far as possible and ensured that no candidate was disadvantaged by the provision of materials that included documents in the English language only.
Where candidates were required to complete documents provided in English (eg an official form), a note in the assessment materials was provided to remind them that they would not be penalised if they provided their answers or responses in English. A glossary of legal terms was also included as part of a candidate's assessment materials. This contained translations of words or phrases that a candidate might otherwise struggle to understand.
All translated documents for the October 2023 SQE2 in Welsh were proofread for linguistic accuracy. In addition, freelance Welsh-speaking solicitors, with relevant subject matter expertise, reviewed the translated assessments in detail and any issues (eg ambiguity in terminology) were resolved by way of further discussions with a Kaplan subject head or member of the academic team.
In the event, no candidates wished to take the October 2023 SQE2 in Welsh. Nor, before the SQE2 in Welsh was fully implemented (1 September 2023), did any candidate provide answers in Welsh for the SQE2 written stations/attendance note or make oral submissions in Welsh.
The process for translating assessment documentation is still under review. Kaplan intends to provide more guidance to candidates on the SQE website on sitting the SQE2 in Welsh (eg on the type of documents that may be translated). It also plans to work more closely with the SMEs, including the SRA SME with responsibility for SQE in Welsh when in post.
My observations led me to the conclusion that thorough planning is taking place to prepare for delivery of all aspects of the SQE in Welsh. It is hoped that some candidates may start to opt for SQE2 to be delivered in Welsh in 2024.
Decisions as to where to set the pass marks strictly adhered to the processes and policies set out in advance. These processes and policies followed well-established standard-setting techniques, as appropriate for a high-stakes assessment leading to a licence to practise. The processes continue to be supported by excellent analyses of the psychometric data and comprehensive reports to support the Assessment Board in determining the pass marks. Overall, the outcomes appear to be fair and defensible.
In preparing for Assessment Board meetings, which I observed, Kaplan provides good management reports which summarise a wide range of psychometric data. These include item and station level analyses and associated measures of test reliability.
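For illustration, one widely used measure of test reliability of the kind such reports typically include is Cronbach's alpha, which summarises the internal consistency of a set of items. The minimal sketch below computes it from a small invented candidate-by-item matrix; the data are not SQE data, and this is offered only as an example of the statistic, not as Kaplan's actual reporting.

```python
# Illustrative sketch only: Cronbach's alpha, a standard measure of
# internal-consistency reliability. The data below are invented and
# are not SQE results.
from statistics import pvariance

def cronbach_alpha(item_scores):
    """item_scores: one list of scores per item, aligned by candidate.
    alpha = k/(k-1) * (1 - sum(item variances) / variance of totals)."""
    k = len(item_scores)
    item_var_sum = sum(pvariance(scores) for scores in item_scores)
    totals = [sum(scores) for scores in zip(*item_scores)]
    return (k / (k - 1)) * (1 - item_var_sum / pvariance(totals))

# Three items answered by four candidates (1 = correct, 0 = incorrect).
items = [
    [1, 0, 1, 1],
    [1, 0, 1, 0],
    [1, 1, 1, 0],
]
print(round(cronbach_alpha(items), 2))  # 0.56 for this toy data
```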
A Standards Advisory Group (SAG) was established to scrutinise, in preparation for each Assessment Board, the provisional technical data that is presented to the Board. This group, which comprises a sub-set of the membership of the Assessment Board (not including the Chair), is able to request additional information to investigate any potential issue arising, in order to help the Assessment Board make the best-informed decisions. This was a welcome and useful development in 2022/23.
The SAG was important when determining the pass boundary for the July 2023 SQE1 FLK1 and FLK2 exams. A key aspect of determining the pass boundary is to maintain standards over time, from one sitting to the next and over the longer term.
For the July 2023 SQE1 exams there was evidence that candidates had found some of the questions slightly more demanding than in previous sittings. This was demonstrated both by the standard reports routinely generated for the Assessment Board and by additional psychometric analyses requested by the SAG to explore the issue.
These analyses also confirmed that candidates appeared to be of broadly similar ability to previous cohorts. They gave the Assessment Board the confidence to approve slightly lower pass marks than in previous sittings, thus taking into account the slightly more demanding questions. This resulted in pass rates similar to previous sittings, and so did not unfairly penalise candidates for facing a slightly more demanding assessment on this occasion. The approach taken gave me confidence that the outcomes of these decisions were fair to all candidates (this and previous cohorts), while maintaining standards over time.
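The underlying reasoning can be shown with a toy example; the numbers below are invented and are not those used by the Assessment Board. If the psychometric evidence indicates a form ran about two marks harder for candidates of the same ability, then expressing the same standard on that form implies a pass mark two marks lower.

```python
# Toy illustration only: the numbers are invented and do not reflect the
# actual SQE pass marks. If equating evidence shows the new form ran
# about 2 marks harder for candidates of equal ability, the same standard
# is expressed on the new form by a pass mark 2 marks lower.
reference_pass_mark = 108  # hypothetical pass mark on an earlier form
difficulty_shift = 2       # marks, estimated from psychometric analysis

adjusted_pass_mark = reference_pass_mark - difficulty_shift
print(adjusted_pass_mark)  # 106: same standard, expressed on a harder form

# A candidate scoring 107 on the harder form passes, because this
# represents the same underlying ability as 109 on the reference form.
```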
Throughout the period of this report, the assessment pass marks and pass rates appear appropriate for these high-stakes exams. Kaplan continues to conduct very good analyses of outcomes by various protected characteristics, and improvements have been made to ensure greater accuracy and definition of the self-declared candidate demographic information.
As I have reported previously, differences in outcomes by ethnic group exist, with white candidates generally achieving a higher pass rate than other groups. I continue to find no evidence of unfairness or bias in any process connected with generating the SQE outcomes.
Indeed, I am reassured that this issue receives close attention and further reviews of data over time are planned. The volume of demographic data is starting to grow as more exams take place.
Interrogating these data, while continuing to seek independent external support from the University of Exeter, is critical to finding answers as to why this differential exists and, more importantly, what can be done to close this achievement gap in the future. The SRA has commissioned the university to research and investigate the possible reasons for this attainment gap as part of its work to review legal professional assessment over time.
In my opinion, it is highly likely that, in determining any action needed to attempt to close the gap, parties outside of those associated with running the SQE will be needed, in addition to any actions Kaplan or the SRA might take, to effect meaningful change. This is because an exam is a snapshot of attainment at a point in time; it is possible that differences in performance by ethnic group will, at least in part, be due to prior levels of support and access to expert resources, as well as the wider societal opportunities available to candidates.
Kaplan and the SRA need to determine how much data are needed, in addition to the outcomes of the (now started) second phase of investigation from the University of Exeter, in order to set a realistic timescale for creating a plan for action. In the meantime, as recommended in the 'Exam delivery and assessment' section, there is room for further development of the review of assessment items that are 'flagged' for further investigation, including where ethnic group performance differences exist.
During 2022/23, Kaplan continued to adopt comprehensive quality assurance procedures and I observed numerous and meaningful improvements based on the learning from the previous year. The overall delivery of the SQE was generally good in 2022/23 and an improvement from the previous year.
Effective quality assurance checks continue to be in place, with improvements being made such as responding to candidate feedback about reasonable adjustments. Checks were undertaken for all key processes. Both Kaplan and the SRA delivered continuous improvements based on learning and committed growing resources to assurance activities.
I reported last year that the SRA had commissioned SMEs to offer assurance to the SRA that Kaplan's assessment creation and production and other key processes were appropriate and of the required quality. This work demonstrated useful outcomes, allowing Kaplan to benefit from detailed observations, which it carefully considered.
As this is still a relatively new working process and relationship, both Kaplan and SRA leaders have reflected on how to get the most out of these experts and how their observations can be used in the most timely and helpful way.
I recommend the leaders of both organisations review the plans to get the most out of this work, by early 2024. This is in order to evaluate progress and to ensure the assurance observations offer the most benefit to the assessment production outcomes.
As part of this review, there should be a renewed commitment by leaders to ensure the culture within both organisations remains:
- hungry to improve
- fully committed to flagging and exploring issues arising, including errors or mistakes
- one where staff are encouraged and rewarded for being open and honest in a timely manner if and when mistakes or errors occur.
Maintaining such a culture takes repeated effort, especially if errors occur, but maintenance is essential to protect the interests of candidates preparing for and taking the exam and wider stakeholders who rely on the outcomes.
As I have mentioned previously, these exams are high-stakes and complex to deliver, and it is inevitable that issues will arise during their operation. An aspiration to be world-class in the provision of these professional exams, which lead to a licence to practise, requires constant effort from leaders to encourage and reward everyone working on the SQE to maintain such a culture.