In June of this year the BDMA held its first ever round of professional examinations for accredited damage management technicians and senior technicians. A detailed look at how the association's examinations are developed will, no doubt, answer some of the questions that potential candidates may have.
The BDMA examinations procedure has been developed by its education committee, whose members are drawn from various sectors of the industry.
Steven Richford, committee chairman, said the first task was to define the syllabus which the exams would cover: “This was done by bringing together existing best practice, core disciplines as identified by relevant certifying authorities such as the IICRC and ASCR, and input from the insurance and loss adjusting industries.”
Exam questions were compiled from eight of the BDMA's founding organisations. Some contributions were written to address specific parts of the syllabus, while others were taken from companies' existing training programmes.
The syllabus is divided into modules, which define the knowledge content of a particular subject or component. Since the syllabus is educational and generic, all questions are stripped of any reference to proprietary products or processes, before they are grouped according to which syllabus modules they relate to.
Richford said creating questions proved to be a demanding task, particularly those offering multiple choice answers. He said: “Alternatives must be reasonable enough not to make the answer self-evident.”
He stated that the BDMA wanted to avoid the route taken by some certifying authorities, which use identical questions in every examination.
Richford said the BDMA wanted to ensure candidates did not gain an unfair advantage through prior awareness. It believes its method of rotating a large bank of questions allows a more realistic assessment of a candidate's knowledge and experience.
On average, 20% of multiple choice and 50% of short answer questions will be changed for each examination. The questions replaced will return to the question bank and become available for future random selection.
A small team is responsible for selecting questions from the bank, each member dealing with a particular set of modules. Only two education committee nominees will see a paper in its entirety.
To monitor and maintain the appropriateness of the examinations, data is collected on the overall response to each question. Richford said this monitoring and feedback were used to remove or adapt questions that did not meet the criteria of relevance and difficulty, ensuring a continually updated test environment.
The marking of papers is carried out by a small team drawn from the education committee. Richford stresses that exam scripts are identified only by a candidate's number and are completely anonymous to the examiner.
The marked papers are presented to the full education committee, which then agrees certification.
Borderline results are reviewed for consistency of marking. Only after the results have been confirmed are candidates identified by name.
Candidates who have not achieved the required standard are able to resit the examination, provided they do so within one year.
Richford commented that the processes were designed to ensure the highest level of security and integrity. He said: “The examination standard reflects a level of competence in which clients can have every confidence.”
After the first round of examinations in June, notable industry figures agreed that the exam was extremely demanding.
“Having taken thousands of examinations, I did not expect to find myself still writing five minutes before the end,” reflected Dr Barry White, technical director of Belfor UK. “It was tough, which is as it should be if accreditation is to be meaningful.”
The BDMA syllabus is available from the same address.