
The Actuary: the magazine of the Institute & Faculty of Actuaries

Education strategy: your views

The year 2005 will bring major changes to the actuarial examination system (for details, see the profession’s website, or p11 of the April 2002 issue of The Actuary). So when I had to do some ‘original research’ as part of a part-time postgraduate diploma, it seemed a good time to survey the views of students and recent qualifiers about the current exam system and the proposed changes. SIAS, and the profession’s committee looking at the new education strategy, were both very interested.

1,001 viewpoints
Well, actually 1,057. To get as many views as possible, I decided to use email, with a link to the survey on a website (www.247survey.com). This approach proved to be popular, with a response rate of about 25% (look out for other consultation by this method in future!) and many commenting that they would like more communication by email. A good cross-section of members replied (see figure 1) and it was great to get this level of response (despite the workload of analysing it all!). A total of 70% felt they were already well informed regarding the education strategy review, but about 15% of student respondents and 20% of fellow respondents had not heard of it at all prior to the survey.

The current state of play
The good and the bad
The good news is that about 80% of respondents were proud to be associated with the actuarial profession and 95% thought ‘respected qualification’ a key advantage of the current system. However, over 75% of students and almost 60% of fellows felt that the exams test ability to pass exams more than ability as an actuary, and only about 30% of students (60% of fellows) believed the system was good preparation for an actuarial career! Personal experience of the exams was negative/very negative for 38% of students and 24% of fellows responding.
The main advantages of the current system were seen as:
– Respected qualification
– Flexibility in order and pace of exams
– Relevance of (at least some) material
The main concerns regarding the current system were:
– Time to get results
– Uncertainty of time to qualification
– Difficulty of process
The fact that candidates need to study all areas was seen equally as a pro and a con.
The introduction of subject 305 (Finance and Investment) from April 2003 seems to have been a popular move. Fewer than 10% of respondents said they would not choose that exam in preference to one of the other 300 series if they were starting again now. We look forward to seeing the first paper in April.

I give up
A worrying ‘over 30%’ of respondents had seriously considered giving up the exams at some point, with a further 20% vaguely considering it. It won’t come as any surprise that failure/difficulty was the top reason, but there were also feelings of disillusionment with the system, that work/life balance had suffered too much, that it was taking too long, or that candidates did not understand why they had failed. Several people had moved to jobs where the qualification was no longer needed, and others felt that the exam content and/or style of assessment were not relevant to their work.

‘It’s not fair!’?
It is easy to say ‘it’s not fair’, but what do we mean by the fairness of an exam system? (48% of students and 62% of fellows believe the current system is a fair way of assessing candidates.) From the questions I asked, the top three fairness factors seemed to be:
– Consistent chance of passing at each session
– Questions relate to clear syllabus
– Assessment differentiates between good and bad candidates
Interestingly, ‘syllabus related to work done by actuaries’ was down at number four (some academics think this is the key factor), and indeed 72% of the fellows valued the fact that the exams had covered actuarial areas not directly connected with their current work. Comments also suggested that factors such as knowing why one had failed, and having no possibility of cheating, were important. However, most respondents felt that an exam system could be fair even if there were a choice of assessment method, or some assessment were not under exam conditions.
And how do we react if we believe that the exam isn’t fair? Unsurprisingly, I found that views that the exams were unfair/irrelevant were correlated with negative personal experience of the exams, greater consideration of giving up, less current/intended involvement in the profession, and less pride in the profession.

The future
Overall, the reaction was positive to the proposed 2005 changes, although there were a number of concerns to address. In particular, the transitional arrangements were a key concern of students (details of these are due to be published in December), together with possible perceived inequity between old and new systems. Responses on three specific aspects are discussed in more detail below.

Alternative assessment methods
Many respondents had experience of alternative assessment methods in the past (eg as part of a university course) in particular dissertations/projects, vivas/oral exams, open-book exams, and computer-based assessments. Views on these were mixed, but there was a strong feeling that some of the exams could be made more relevant to work by using alternative assessment techniques. Many respondents thought several ideas were worth investigating, in particular the following:
– Open-book exams Comments about open-book exams were very positive, particularly mentioning that being able to refer to, say, guidance notes was more like the work situation, and that it focused questions on the key skill of application. Several commented that time constraints meant candidates still needed to know the material well (and open-book exams could be harder than closed-book ones).
– Modelling exam and IT-based assessment A modelling exam was seen as work-related and positive. There were concerns regarding cheating if the exam was unsupervised and practical issues regarding whether everyone has the same chance to prepare for the exam. Cheating was the main concern regarding IT-based assessment too (although this could perhaps be addressed by using supervised centres, similar to those used for the UK driving test theory exam). Immediate results would be popular!
– Viva for repeated FAs The idea of vivas in some circumstances was viewed positively overall, but there were concerns regarding the subjectivity of the testing, the pressure on candidates, and practical issues (eg workload of assessment, overseas students, and how soon after the exam it could take place). In addition, a number commented that ‘fail is fail’, fearing a devaluation of the qualification.
– Optional dissertation/project Dissertations/projects caused strong positive and negative reactions: some loved them, some hated them. They were felt to allow deep learning and test key skills and to give (at least some) candidates more chance to perform. The idea of compulsory dissertations was unpopular, particularly because of the time it would be likely to take, and there were concerns regarding cheating and consistency of marking (and availability of markers) for any dissertation. A viva was seen to help minimise the possibility of cheating.
Following these results, the Education Strategy Implementation Committee is seriously investigating open-book (or part open-book) exams for the specialist applications subjects. Also being investigated are:
– the possibility of vivas for repeated fails in some circumstances;
– on-demand computer-based assessment (at supervised centres to avoid cheating) for two early subjects;
– an assessed modelling course as part of core applications;
– whether a supervised dissertation could be an alternative offered at specialist level.
Any alternative assessments are likely to be introduced on a trial basis initially.

‘It’s good to talk’
Or perhaps ‘it’s good to sit a written paper on communication skills, possibly with a choice of questions’? For while communication skills were judged an area where alternative assessment was considered feasible, when asked specifically about the best way of testing those skills, 36% preferred an exam, either in the current form or with a choice of questions, followed by 22% preferring an assessed course. Others were fairly evenly split (about 10% each) between presentations, a portfolio of evidence, a specific work experience requirement, or testing as part of other exams. There was strong support for testing communication skills at some point: over 80% of fellows and 70% of students believed or strongly believed that communication skills should be tested (see figure 2). An assessment of communication skills is to be retained in the new strategy (exact form still to be confirmed).

‘It’s good to work’
There was very strong support, both from Faculty and from Institute members, for having a work experience requirement. A total of 94% of respondents thought that work experience was a valuable part of actuarial training and only 2% preferred no requirement. When asked about the preferred length of any requirement, the mode was three years, median four years, and mean 3.8 years (full results in figure 3). In addition, over half said they would prefer more structure to the requirement (only 17% were against this), and there was no material difference between student and qualified views on this.
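To illustrate how a single set of responses can have a mode, median, and mean that all differ, here is a minimal sketch using Python’s standard statistics module. The data below are hypothetical, chosen only so the three summary figures match those reported above (mode three years, median four, mean 3.8); they are not the actual survey responses.

```python
from statistics import mean, median, mode

# Hypothetical preferred lengths (in years) of a work experience
# requirement -- illustrative only, NOT the survey's raw data.
responses = [2, 3, 3, 3, 3, 3, 3, 4, 4, 4, 4, 5, 5, 5, 6]

print(mode(responses))    # most common value: 3
print(median(responses))  # middle value when sorted: 4
print(mean(responses))    # arithmetic average: 3.8
```

The gap between mode and median here simply reflects a distribution with a cluster at three years and a tail of longer preferences, which is consistent with the shape one would expect from figure 3.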
The Professional Competence Taskforce will be looking at the practicalities/details of implementing a more structured work experience requirement from 2005.