

Insight from the examiners: Case Study exam

Author: ICAEW Insights

Published: 03 Feb 2022


2021 saw the first two sittings of the Case Study exam on the new Professional and Advanced Level exam software. The Case Study Senior Examiners tell us how it went and how future students can learn from the experience.

For the first two Case Study sessions on the new exam software – House Pride (HP), July 2021; and Dog Gourmet Supplies (DGS), November 2021 – students who had prepared diligently were able to showcase their professional skills more effectively than ever before. By producing their calculations efficiently, they could put more effort into thinking and writing their narrative sections.

For many, this enabled them to earn more marks on the ‘right-hand side’ of the marking key, especially for Applying Judgement – an area in which weaker students often struggle. Stronger students differentiated themselves by extending their analysis work into reasoned argument, applying professional scepticism to add real value to the reader. This was apparent in both of the 2021 Case Study exams, for example in Requirement 3 in the HP exam (evaluation of a proposal to enter into a supply arrangement with Carey, a major UK builder of retirement accommodation). Those who planned their calculations and set them out succinctly could then compare the figures being presented with Carey’s previous projects, as well as with those described for a similar company (Antoine) in the Advance Information (AI). In doing so, they were able to highlight some of the potential issues for HP.

Developed analysis

Armed with the improved functionality of the spreadsheets, candidates now have a greater opportunity to demonstrate their higher-level competencies.

In some of the best scripts from both 2021 sessions, we saw excellent examples of flexing and sensitivity analysis. Students must always work first with the numbers presented in the scenario, but the intuitive functionality of the new software then allows them to perform a calculation quickly that shows the impact of altering one or two key variables.
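The idea of flexing is straightforward: compute the base case from the figures given in the scenario, then rework the calculation with one or two key variables changed to show the impact. A minimal sketch, using entirely hypothetical figures (not from either exam scenario):

```python
# "Flexing" a profit calculation: compute the base case from the given
# figures, then rework it with one key variable altered.
# All figures are hypothetical illustrations, not exam data.

def gross_profit(revenue: float, gp_margin: float) -> float:
    """Gross profit for a given revenue and gross profit margin."""
    return revenue * gp_margin

base = gross_profit(revenue=500_000, gp_margin=0.40)    # scenario figure
flexed = gross_profit(revenue=500_000, gp_margin=0.30)  # flexed margin

print(f"Base case:   {base:,.0f}")           # 200,000
print(f"Flexed case: {flexed:,.0f}")         # 150,000
print(f"Impact:      {base - flexed:,.0f}")  # 50,000
```

The point is not the arithmetic itself but the habit: always work first with the scenario's own numbers, then show the sensitivity of the result to the assumption being challenged.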

This is illustrated by the first sample answer from July 2021, where alternative figures are provided using a different gross profit margin. This student went on to develop the analysis of assumptions at Requirement 2 and was suitably rewarded.

Similarly, in the first sample answer from November 2021, by deftly reworking the initial calculations in Requirement 3 using the spreadsheet functionality, the candidate showed quickly and efficiently the impact of stripping out the loss-making “paid ads” on overall profit, providing more insight in the subsequent evaluation.

However, efficiency doesn’t always translate into effectiveness. For example, in Requirement 2 of the November 2021 exam, many candidates calculated an answer very quickly – but unfortunately an incorrect one. The spreadsheet functionality is certainly helpful, but it does not replace the need to demonstrate the competencies being assessed.

[For Requirement 3] …a surprising number [of candidates] produced incorrect figures. A large proportion missed the fact that Exhibit 21 provided the RRP/kg, not the actual revenue per kg. Candidates were expected to apply the discount… to arrive at DGS’s predicted revenue. In missing this, many calculated an ROI of 372%. This high figure alerted better candidates to the need to challenge their own numberwork, and some – to their credit – reworked their figures by carrying out some sensitivity analysis or break-even analysis.

(Examiners’ report on November 2021 Case Study)

Requirement 2 lent itself well to sensitivity analysis of some sort, but few candidates took the opportunity to explore how sensitive the decision was, based on numbers of which most candidates were – correctly – sceptical.
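One simple way to explore how sensitive such a decision is would be a break-even calculation: finding the volume (or price, or margin) at which the proposal stops being worthwhile. A minimal sketch, again with hypothetical figures rather than the exam's data:

```python
# Break-even volume: the sales volume at which total contribution
# exactly covers fixed costs. Figures are hypothetical illustrations.

def breakeven_volume(fixed_costs: float, price: float,
                     variable_cost: float) -> float:
    """Volume at which (price - variable_cost) * volume = fixed_costs."""
    return fixed_costs / (price - variable_cost)

vol = breakeven_volume(fixed_costs=120_000, price=6.0, variable_cost=3.5)
print(f"Break-even volume: {vol:,.0f} kg")  # 48,000 kg
```

Comparing the break-even figure with the volumes assumed in the scenario gives a concrete basis for the scepticism the examiners are looking for.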

From 2022, the marking key will start to include credit for this type of developed analysis, provided that it is relevant.
