Independent review of the Australian Transport Safety Bureau's investigation methodologies and processes

1 December 2014

Mr. Martin Dolan
Chief Commissioner
Australian Transport Safety Bureau
62 Northbourne Avenue
Canberra ACT 2601

Dear Mr. Dolan,

I am pleased to provide you with the Transportation Safety Board of Canada's (TSB) report on the independent, objective peer review of the Australian Transport Safety Bureau (ATSB) investigation process and methodology.

The review was conducted in accordance with the agreed-upon terms of reference specified in our July 2013 Memorandum of Understanding. This report constitutes the complete and final deliverable.

I thank you for the trust and confidence that the ATSB has placed in the TSB by asking us to conduct this review. We were pleased to be of assistance and to share our expertise. We also appreciated the opportunity to learn from the ATSB.

Sincerely,

Original signed by Kathleen Fox
Kathleen Fox
Chair

Executive summary

Introduction

The Australian Transport Safety Bureau (ATSB) conducted an investigation into the ditching of a Westwind 1124A near Norfolk Island, Australia, that occurred on 18 November 2009. Shortly after the release of the final report (AO-2009-072) in August 2012, a television program questioned the quality and findings of the investigation. This prompted an Australian Senate committee to conduct an inquiry into the investigation, which yielded a number of unfavourable findings and recommendations for change.

At the ATSB's request, the Transportation Safety Board of Canada (TSB) agreed to conduct an independent, objective review (hereafter the TSB Review) of the ATSB's investigation methodologies and processes. Under the terms of reference, the TSB would not reinvestigate the Norfolk Island occurrence, but would review how that investigation was conducted, and do the same for two other investigations of similar scope in order to provide a useful comparison. Consequently, the TSB Review examined these three occurrences:

  • AO-2009-072 – Ditching – Israel Aircraft Industries Westwind 1124A aircraft, VH-NGA, 5 km SW of Norfolk Island Airport, 18 November 2009 (Norfolk Island)Footnote 1
  • AO-2011-166 – Helicopter winching accident involving an AgustaWestland AW139 helicopter, VH-SYZ, 16 km WSW of Wollongong Airport, NSW, 24 December 2011 (Kangaroo Valley)Footnote 2
  • AO-2010-043 – Collision with terrain – Piper PA-31P-350, VH-PGW, 6 km NW of Bankstown Airport, NSW, 15 June 2010 (Canley Vale)Footnote 3

The review encompassed the full investigation timeline: the initial notification, the data collection phase, the analysis phase, and the production and release of the final report.

What we found

Comparison of ATSB and TSB methodologies

The two organizations' methodologies have similar foundations, and focus on identifying the systemic deficiencies, individual actions, and local conditions that result in occurrences. Similarities in Australia's and Canada's socio-economic cultures and legal traditions have resulted in similar enabling legislation, overall organizational structures, definitions, internal processes, and robust methodologies that are linear, iterative, and flexible.

The TSB Review compared the two organizations' methodologies against the standards and recommended practices outlined in Annex 13 to the International Civil Aviation Organization (ICAO) Convention on International Civil Aviation, and found that both met or exceeded the intent and spirit of those provisions.

The Norfolk Island investigation

The TSB Review of the Norfolk Island investigation revealed lapses in the application of the ATSB methodology with respect to the collection of factual information, and a lack of an iterative approach to analysis. The review also identified potential shortcomings in ATSB processes: errors and flawed analysis stemming from the poor application of existing processes were not caught and mitigated.

First, an early misunderstanding about the respective responsibilities of the Australian Civil Aviation Safety Authority (CASA) and the ATSB in the investigation was never resolved. As a result, the ATSB collected insufficient information from Pel-Air to determine the extent to which the flight planning and monitoring deficiencies observed in the occurrence were present in the company more generally.

Poor data collection also hampered the analysis of specific safety issues, particularly fuel management, company and regulatory oversight, and fatigue (the ATSB does not use a specific tool to guide investigation of human fatigue).

Weaknesses in the application of the ATSB analysis framework resulted in those data insufficiencies not being addressed and potential systemic oversight issues not being analyzed. Ineffective investigation oversight resulted in issues with data collection and analysis not being identified or resolved in a timely way.

All three peer reviews conducted on the Norfolk Island draft report identified issues with respect to factual information, analysis, and conclusions. Many of these concerns were never followed up after the review process was complete. The ATSB process does not include a second-level review to ensure that feedback from peer reviews is adequately addressed.

After investigation reports have gone through peer and management review, they are sent to directly involved parties (DIPs) for comment. In the Norfolk Island investigation, the DIP process was run twice: once with the initial draft, and a second time after the report had been revised. However, there is no process to ensure that the ATSB communicates its responses to DIPs' comments. Formal responses increase DIPs' understanding of the action taken on their submissions, and may make them more amenable to accepting the final report.

In the Norfolk Island investigation, the Commission's review of the report took place immediately after the first DIP process was completed, 31 months after the occurrence. At this stage in an investigation, it is difficult to address issues of insufficient factual information since perishable information will not be available and the collection of other information could incur substantial delays. At the ATSB, the Commission does not formally review some reports until after the DIP process is complete, and in any event, there is no robustly documented process after the Commission review to ensure that its comments are addressed before the report is finalized. Both of these aspects of the ATSB review process increase the risk that deficiencies in the scope of the investigation and the quality of the report will not be addressed.

The investigation identified a safety issue concerning insufficient guidance to flight crews on obtaining timely weather forecasts en route to help them make decisions when weather conditions at the destination were deteriorating. When the safety issue was presented to CASA, it was categorized as "critical", but in the final report it was described as "minor", which caused significant concern among stakeholders. The TSB Review observed that this shifted the focus of the discussion to the label and away from the issue itself and the potential for its mitigation.

In the final stages of the investigation, senior managers were aware of the possibility that the report would generate some controversy, but communications staff were not consulted and no communications plan was developed. Once the investigation became the subject of an external inquiry, the ATSB could no longer comment publicly on the report, which hampered the Bureau's ability to defend its reputation.

The response to the Norfolk Island investigation report clearly demonstrated that the report did not address key issues in the way the Australian aviation industry and members of the public expected.

The Kangaroo Valley and Canley Vale investigations

The review of the Kangaroo Valley and Canley Vale investigations showed that when the ATSB methodology is adhered to, and the component tools and processes to challenge and strengthen analysis are applied, the resulting investigation is more defensible.

In contrast to Norfolk Island, the Kangaroo Valley and Canley Vale investigations underwent regular critical reviews and used the ATSB analysis tools effectively, which gave rise to well-documented decisions, and revised data collection plans and analyses. In the Canley Vale investigation, additional information collected as a direct result of a critical review guided informed decisions with respect to the investigation of regulatory oversight.

In the Kangaroo Valley investigation, the target timeline outlined in the ATSB Safety Investigation Quality System (SIQS) for a Level 3 investigation was exceeded, despite significant effort by the team to expedite the investigation. This may indicate that the targets are unrealistic, that the investigation was incorrectly classified, or that other work influenced the published investigation schedule. Significant delays in completing an investigation increase the risk that stakeholders' expectations with respect to timeliness will not be met.

Nevertheless, because of the teams' active engagement with stakeholders in the Kangaroo Valley and Canley Vale investigations, expectations with respect to schedules were well managed and timely action was taken on safety issues.

In the Canley Vale investigation, events prior to the occurrence raised questions with respect to regulatory non-compliance and oversight. The report states that issues of regulatory non-compliance did not contribute to the occurrence, and the analysis tools were indeed effectively used to support this. However, the report could have benefitted from a more thorough discussion to clarify the underlying rationale for this conclusion.

Unlike Norfolk Island and Kangaroo Valley, the Canley Vale investigation included a closure briefing, which provided an opportunity to discuss lessons learned.

Recommendations

The TSB Review is making 14 recommendations to the ATSB in four main areas:

  • Ensuring the consistent application of existing methodologies and processes
  • Improving investigation methodologies and processes where they were found to have deficiencies
  • Improving the oversight and governance of investigations
  • Managing communications challenges more effectively.

Recommendation #1: Given that the ATSB investigation methodology and analysis tools represent best practice and have been shown to produce very good results, the ATSB should continue efforts to ensure the consistent application and use of its methodology and tools.

Recommendation #2: The ATSB should consider adding mechanisms to its review process to ensure there is a response to each comment made by a reviewer, and that there is a second-level review to verify that the response addresses the comment adequately.

Recommendation #3: The ATSB should augment its DIP process to ensure the Commission is satisfied that each comment has been adequately addressed, and that a response describing actions taken by the ATSB is provided to the person who submitted it.

Recommendation #4: The ATSB should review its risk assessment methodology and the use of risk labels to ensure that risks are appropriately described, and that the use of the labels is not diverting attention away from mitigating the unsafe conditions identified in the investigation.

Recommendation #5: The ATSB should review its investigation schedules for the completion of various levels of investigation to ensure that realistic timelines are communicated to stakeholders.

Recommendation #6: The ATSB should take steps to ensure that a systematic, iterative, team approach to analysis is used in all investigations.

Recommendation #7: The ATSB should provide investigators with a specific tool to assist with the collection and analysis of data in the area of sleep-related fatigue.

Recommendation #8: The ATSB should review the quality assurance measures adopted by the new team leaders and incorporate them in SIQS to ensure that their continued use is not dependent on the initiative of specific individuals.

Recommendation #9: The ATSB should modify the Commission report review process so that the Commission sees the report at a point in the investigation when deficiencies can be addressed, and the Commission's feedback is clearly communicated to staff and systematically addressed.

Recommendation #10: The ATSB should undertake a review of the structure, role, and responsibilities of its Commission with a view to ensuring clearer accountability for timely and effective oversight of the ATSB's investigations and reports.

Recommendation #11: The ATSB should adjust the critical investigation review procedures to ensure that the process for making and documenting decisions about investigation scope and direction is clearly communicated and consistently applied.

Recommendation #12: The ATSB should take steps to ensure closure briefings are conducted for all investigations.

Recommendation #13: The ATSB should provide clear guidance to all investigators that emphasizes both the independence of ATSB investigations, regardless of any regulatory investigations or audits being conducted at the same time, and the importance of collecting data related to regulatory oversight as a matter of course.

Recommendation #14: The ATSB should implement a process to ensure that communications staff identify any issues or controversy that might arise when a report is released, and develop a suitable communications plan to address them.

ATSB best practices to bring back to the TSB

The TSB Review identified a number of best practices and ways to potentially improve investigation operations at the TSB. An internal analysis will determine which practices could be adopted, and to what extent they could be incorporated into the TSB investigation methodology.

1.0 Introduction

1.1 Background

The Australian Transport Safety Bureau (ATSB) conducted an investigation into the ditching of a Westwind 1124A that occurred on 18 November 2009 near Norfolk Island, Australia. After the final ATSB investigation report (AO-2009-072) was released, a television program raised questions about the investigation. In the wake of this broadcast, an Australian Senate Committee conducted an inquiry, which made a number of unfavourable findings as well as several recommendations for change.

The ATSB approached the Chair of the Transportation Safety Board of Canada (TSB) and asked the TSB to conduct an independent review of ATSB investigation methodologies and processes (the TSB Review). The TSB, which saw the review as a mutual learning opportunity that could lead to improved investigation processes in both organizations, agreed to conduct it.

The objectives of the TSB Review were twofold:

  1. Provide the ATSB with an independent, objective peer review of its investigation process/methodology and of the application of its methodology to selected occurrences (including the Norfolk Island occurrence).
  2. Identify best practices in both organizations that could be shared to improve existing processes and methodologies.

1.2 Terms of reference

The focus of the TSB Review was investigation process and methodology; the TSB was not to reinvestigate the Norfolk Island occurrence and would do its work independently of any other person or organization.

The terms of reference (TOR) set out five areas of inquiry for the TSB Review.

First, the TSB was to conduct a comparative analysis of ATSB and TSB investigation methodologies, including the approach for risk assessment of safety issues, comparing them against the relevant provisions of Annex 13 to the Convention on International Civil Aviation. Strengths (best practices) and weaknesses (gaps) of each methodology were to be identified.

Next, the TSB was to review the Norfolk Island and other ATSB investigations to identify strengths and weaknesses in four areas:

  1. Application of the investigation methodology
  2. Management and governance of the investigation
  3. Investigation report process
  4. External communication.

The TOR established that the TSB was to be fully responsible for conducting the review and for analyzing the information collected, and to do so independently and objectively. The final report would present the independent views of the TSB.

1.3 Review methodology

The following tasks were completed as part of the TSB Review:

  1. Documentation review: The TSB reviewed the supporting documentation for both organizations' methodologies and processes so that it could compare them.
  2. Identification of ATSB investigations: The ATSB was asked to propose two investigations to the TSB Review that were similar in scope to the Norfolk Island investigation and could be usefully compared with it.
  3. Data and information collection: The TSB Review team collected data and information to clearly identify how the three investigations were done; sources included the following:
    1. Interviews with ATSB personnel: Extensive interviews were conducted with the investigators in charge of the three investigations, as well as with other investigators involved in them, and ATSB managers and commissioners.
    2. Workspace and investigation documentation review: The TSB was given access to the electronic workspaces for the three investigations, which allowed a thorough review of all available documentation.
  4. Analysis: The TSB Review team constructed a detailed sequence of events for each investigation, and compared the progress of the investigation with the processes and methods set out in ATSB documentation. Deviations from the described methodology were examined to identify the reasons for them or to identify potential best practices.
  5. Reporting: This report was prepared to present the findings of the TSB Review. Before the report was published, it was reviewed by a steering committee established at the TSB for this purpose, and by the ATSB for factual accuracy.

2.0 Comparison of ATSB and TSB methodologies

2.1 Analytical approach

This analysis compares and contrasts the overarching methodologies used by the ATSB and the TSB, including the methodology used to assess risk associated with safety issues; its objective is to identify potential weaknesses, strengths, and best practices.

The source documents for this analysis are each organization's published methodologies.

2.2 Overview of methodologies

The ATSB and TSB methodologies have a similar theoretical foundation, and both focus on identifying the systemic deficiencies, individual actions, and local conditions that result in occurrences.

The Australian and the Canadian organizations operate in similar socio-economic cultures and legal traditions, giving rise to similar enabling legislation that prohibits assigning blame, provides privilege for on-board recordings and witness statements, and requires public reporting. There are some legislative differences, but these are outside the scope of this analysis.

These similarities and the nature of safety investigations result in the TSB and ATSB methodologies having similar definitions, internal processes, and overall structure. Both

  • are generally very linear, following a path through input (occurrence), process (investigation), and output (safety communication);
  • are iterative, with numerous loops back to earlier processes or repetitive applications of processes;
  • allow for overlap of processes throughout an investigation; and
  • use a standard, defined terminology.

Both organizations have detailed documentation of their respective methodology. The ATSB Safety Investigation Quality System (SIQS) is documented in a set of manualsFootnote 4 with a designated member of the investigation staff assigned to oversee and coordinate the administration of the system and its documentation. At the TSB, in contrast, responsibility for creating and maintaining the various documents (e.g., analysis methodology manuals, manuals of investigation, etc.) is spread throughout the modes and a multi-modal Standards and Training group.

The TSB has developed manuals for specific investigation topics, including the investigation and analysis of organizational and management influences, as well as the investigation of human fatigue, but the ATSB does not have such guidance for its investigators.

Both the ATSB and the TSB train investigators in their respective methodologies and their application. Training was outside the scope of this review.

The TSB and the ATSB have developed and implemented proprietary software applications based on their methodologies. The TSB Review found that investigators at both organizations use workarounds if the tools are cumbersome or difficult to use.

Both organizations aim to integrate their methodologies into day‑to‑day investigative work, which can be challenging. For example, the review of TSB safety analysis differs among the modal Standards and Performance groups; modes differ in the degree to which safety analysis tools are used; and there is frequently no critical review of the safety analysis before the report is drafted. At the ATSB, some investigators in charge (IICs) work with their teams to review the analysis against evidence tables before the report is drafted, although this is not required by the methodology. The extent to which peer reviewers and team leaders consult the analysis tools when reviewing a report also varies.

The ATSB and the TSB analysis software tools differ in that ATSB analysis makes use of embedded evidence tables to validate both the existence of safety factors and their influence on an occurrence. The evidence tables can also be used to assess the importance of safety factors that were not causal or contributory. In contrast, the TSB approach does not generally use evidence tables.Footnote 5

Both the ATSB and the TSB have exceeded target timelines in completing investigations and releasing reports.

2.3 Notification and assessment processes

Australia and Canada have legislative/regulatory requirements specifying the occurrences that must be reported. Similarly, both the ATSB and the TSB have staff assigned 24/7 to receive notifications and start assessment processes to determine the extent of response required. The organizations assess many of the same factors before making a decision whether to investigate.

The ATSB may rescind a decision to investigate if it is determined that there is no safety value to be gained from continuing the investigation, provided it issues a statement setting out the reasons for discontinuing the investigation. The Canadian Transportation Accident Investigation and Safety Board Act mandates that once an investigation is undertaken, the TSB must follow through to make findings and report publicly.

Both the ATSB and the TSB have several degrees of occurrence classification, and determine the level of investigation primarily in consideration of the potential to advance transportation safety. Other factors in the decision include the anticipated level of resources, the complexity of the investigation, the time required to complete it, the level of public interest, and the severity of the occurrence.

TSB classification levels have not changed since 1994, and are currently under review. Reports made public (Classes 2, 3, and 4) always include findings. Reports with only factual information (Class 5) are provided to coroners where applicable, but are not publicly released. In contrast, the ATSB recently added two occurrence classification levels; one provides for a limited-scope investigation, for which a report of factual information is published without analysis or findings.

2.4 Investigation/Data collection processes

The methodology review revealed many similarities and no significant differences in the organizations' investigation/data collection processes. For example, they

  • do not commence on-site activities until it is safe for the investigation team to enter the site;
  • have authority to control access to the site, and to compel persons or organizations to provide information;
  • may collect data on anything that could affect safety;
  • conduct data collection as an iterative process;
  • collect the same types of data and have the same types of collection capabilities; and
  • have legislated provisions to prevent the unauthorized release of information and to prevent information from being used in courts of law or disciplinary proceedings.

2.5 Sequence of events analysis

2.5.1 Processes

The ATSB and TSB methodologies both involve the development and analysis of a sequence of occurrence events, and recognize that this work will be iterative and likely lead to collection of more information and additional analysis. In both cases, analysis starts from the sequence of occurrence events to look for higher-level deficiencies in risk controls or organizational influences. Both organizations have developed proprietary software tools to document the sequence of events and conduct the analysis.

Both methodologies include preliminary analysis, which guides the scope of the investigation and data collection. The document describing ATSB's preliminary analysis methodology is substantially more detailed than the TSB's, which is very high-level.

The ATSB makes routine use of evidence tables to validate both the existence of safety factors and their influence on an occurrence. The evidence tables can also be used to assess the importance of safety factors that were not causal or contributory. The TSB does not make regular use of evidence tables.
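
To illustrate the concept, an evidence table can be thought of as a record that pairs a candidate safety factor with the evidence supporting and refuting it, so that both its existence and its influence can be tested against documented facts. The following Python sketch is a minimal, hypothetical model of that idea; the class, fields, and example evidence are invented for illustration and do not represent the ATSB's SIIMS software.

    from dataclasses import dataclass, field

    @dataclass
    class EvidenceTable:
        """Hypothetical model of an evidence table for one safety factor."""
        safety_factor: str                                   # hypothesis under test
        supporting: list[str] = field(default_factory=list)  # evidence for existence/influence
        refuting: list[str] = field(default_factory=list)    # evidence against

        def existence_supported(self) -> bool:
            # Crude illustration: the factor survives only if there is
            # supporting evidence and no unresolved refuting evidence.
            return bool(self.supporting) and not self.refuting

    # Usage with an invented safety factor:
    table = EvidenceTable(
        safety_factor="Crew did not obtain updated forecasts en route",
        supporting=["ATC recordings show no forecast requests after departure"],
    )
    print("supported" if table.existence_supported() else "not supported")

A structure of this kind also makes validation reviewable: a peer reviewer can check each finding against the recorded evidence rather than against the report narrative alone.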

2.5.2 Risk assessment of safety issues

Once a systemic deficiency or safety factor has been identified, both methodologies use very similar processes to examine the risk it poses: they take into consideration existing risk controls or defences before defining the worst credible scenario (ATSB) or adverse consequence (TSB) (Table 1), and they determine the severity and likelihood of the consequence before rating the risk. The organizations differ, however, in the methods they apply to estimate the consequence component of risk.

Table 1. Levels of consequence used in the determination of risk

ATSB worst credible scenario    TSB adverse consequence
Catastrophic                    Catastrophic
Major                           Major
Moderate                        Moderate
Minimal                         Negligible

The ATSB determines consequence level using two criteria:

  • the severity of the occurrence (e.g., incident with no injuries versus accident involving loss of aircraft and multiple fatalities), and
  • the type of transport operation involved.

According to the ATSB's guidelines, the inclusion of the type of operation "reflects the ATSB policy of prioritizing its investigation activities towards those issues that present a threat to public safety and are the subject of widespread public interest."Footnote 6 This prioritization is put into effect by applying a 4 × 4 matrix, which captures the type of operation and the outcome, to help determine one of four possible consequence levels (Table 2).Footnote 7

Table 2. Criteria used by the ATSB to determine the level of consequence when conducting a risk assessment

Air transport >5,700 kg (fare-paying passengers)
  Minimal: Minor incident only (e.g. birdstrike)
  Moderate: Incident
  Major: Accident; Serious incident; Incident with many minor injuries
  Catastrophic: Accident with multiple fatalities, or aircraft destroyed plus fatalities/serious injuries

Air transport >5,700 kg (freight); Air transport <5,700 kg (fare-paying passengers)
  Minimal: Incident
  Moderate: Accident; Serious incident; Incident with many minor injuries
  Major: Accident with multiple fatalities, or aircraft destroyed plus fatalities/serious injuries
  Catastrophic: N/A

Other commercial operations
  Minimal: Accident; Serious incident; Incident with many minor injuries
  Moderate: Fatal accident; Accident with aircraft destroyed or multiple serious injuries
  Major: N/A
  Catastrophic: N/A

Private operations
  Minimal: Accident with aircraft destroyed or multiple serious injuries
  Moderate: Fatal accident
  Major: N/A
  Catastrophic: N/A

Thus, depending on aircraft size and the type of transport operation, occurrences involving fatalities or injuries are assigned different consequence levels.
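
To make the logic of Table 2 concrete, the following Python sketch encodes the matrix as a lookup keyed by operation category and outcome. The identifiers are shorthand labels invented for this illustration; they are not ATSB terminology or software.

    # Table 2 encoded as a lookup: operation category -> outcome -> consequence level.
    CONSEQUENCE = {
        "air transport >5,700 kg (pax)": {
            "minor incident (e.g. birdstrike)": "Minimal",
            "incident": "Moderate",
            "accident / serious incident / many minor injuries": "Major",
            "multiple fatalities, or destroyed plus fatalities/serious injuries": "Catastrophic",
        },
        "air transport >5,700 kg (freight) or <5,700 kg (pax)": {
            "incident": "Minimal",
            "accident / serious incident / many minor injuries": "Moderate",
            "multiple fatalities, or destroyed plus fatalities/serious injuries": "Major",
        },
        "other commercial operations": {
            "accident / serious incident / many minor injuries": "Minimal",
            "fatal accident, or destroyed or multiple serious injuries": "Moderate",
        },
        "private operations": {
            "destroyed or multiple serious injuries": "Minimal",
            "fatal accident": "Moderate",
        },
    }

    def consequence_level(operation: str, outcome: str) -> str:
        # Cells that Table 2 leaves empty are treated as "N/A".
        return CONSEQUENCE.get(operation, {}).get(outcome, "N/A")

    # The same worst-case outcome maps to different consequence levels
    # depending on the type of operation:
    print(consequence_level("air transport >5,700 kg (pax)",
                            "multiple fatalities, or destroyed plus fatalities/serious injuries"))  # Catastrophic
    print(consequence_level("private operations", "fatal accident"))  # Moderate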

The TSB approach to determining severity of a consequence takes into account other factors in addition to type of operation and outcome; these include injury to other people or damage to property and the environment, as well as effects on commercial operations.

Another difference between the methodologies lies in the scale used to rate the likelihood of the worst credible scenario (ATSB) or adverse consequence (TSB): the ATSB uses a four‑level scale whereas the TSB uses a five‑level scale (Table 3).

Table 3. Levels of likelihood used in the determination of risk

ATSB               TSB
Frequent           Frequent
(no equivalent)    Probable
Occasional         Occasional
Rare               Unlikely
Very rare          Most improbable

Once the risk assessment is complete, the two organizations tailor their safety communications according to risk level, with higher risk prompting higher-level communications.
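
This report does not reproduce the matrix either agency uses to combine consequence and likelihood into a risk level, so the following Python sketch uses an invented placeholder rating purely to show the shape of the logic: assessed risk rises with both scales, and higher risk triggers a higher tier of safety communication (with recommendations as the highest level, as described in section 2.7).

    # Placeholder illustration only: the scoring rule and communication tiers
    # below are invented and are not ATSB or TSB policy.
    LIKELIHOOD = ["Very rare", "Rare", "Occasional", "Frequent"]           # ATSB scale (Table 3)
    CONSEQUENCE_LEVELS = ["Minimal", "Moderate", "Major", "Catastrophic"]  # ATSB scale (Table 1)

    def communication_tier(likelihood: str, consequence: str) -> str:
        # Invented rule: score risk by summing the scale positions, then map
        # the highest scores to the highest level of safety communication.
        score = LIKELIHOOD.index(likelihood) + CONSEQUENCE_LEVELS.index(consequence)
        if score >= 5:
            return "recommendation"            # highest-level safety communication
        if score >= 3:
            return "safety advisory notice"
        return "discussion in the final report"

    print(communication_tier("Frequent", "Major"))  # -> recommendation
    print(communication_tier("Rare", "Moderate"))   # -> discussion in the final report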

2.5.3 Internal review of analysis

The ATSB methodology specifies that investigations must undergo peer, management, and Commission review before a final report is released.Footnote 8,Footnote 9 The peer review checklists require a review of all findings and the underlying evidence tables. Management review assesses the adequacy of each peer review process and the standard of the investigation report against ATSB quality objectives. However, the ATSB executive review and approval process is focused primarily on the report:

The Executive review process is essentially an approval process and detailed review of all elements of the investigation and supporting material should not be required. The Executive expects that once the Manager has recommended to the General Manager that the report is suitable for release; the Executive should only need to undertake a high level review of the report and its findings to satisfy itself of the completeness of the investigation, check standardisation aspects and consider broader safety and strategic implications associated with release of the report.Footnote 10

Processes to ensure TSB investigations are reviewed include the following:

  • Occurrence workload management tools guide each investigation through the sequence of reviews.
  • Modal Standards and Performance groups have the function of reviewing all investigations, including documentation of analysis.
  • The TSB Manual of Investigations, Volume 3, Part 4, "Report Writing Standards", provides guidance for writing and reviewing reports, and includes the requirement at each level of review to verify that documentation is complete.

2.6 Report production processes

The two organizations' methodologies are similar in that the investigator in charge (IIC)/investigation team prepares a draft investigation report, which then undergoes review and approval processes.

As described below, the TSB has formal mechanisms to ensure that there is a response to questions or comments arising from a report review by designated reviewers and the Board—and that the responses themselves are reviewed. The ATSB has no similar mechanisms.

All TSB draft reports are reviewed by the Board before being released for external review, whereas not all ATSB draft reports are reviewed by the Commission in this phase.

Both organizations send a draft report to external parties for comment, although the ATSB distribution is slightly wider than the TSB's. While both agencies prepare responses to comments submitted by reviewers, this response process represents another substantial difference between the ATSB and TSB methodologies.

At the TSB, the IIC coordinates the preparation of a designated-reviewer response package that contains a response to each of the reviewer's comments as well as a draft of the Board decision as to whether or not the report should be revised in consequence. The response package is reviewed at peer, management, and Board levels to ensure that the response to each representation is appropriate.

Then, in compliance with paragraph 24(4)(d) of the Canadian Transportation Accident Investigation and Safety Board Act, when the final report is released, the TSB sends each designated reviewer a response package containing the responses to their particular submission.

The ATSB process requires the IIC to put the submission comments in a table along with the responses to them. Unlike at the TSB, there is no systematic review by senior managers or the Commission of the IIC's responses, nor does the ATSB provide any feedback to submitters on how the organization responded to their comments.

2.7 External communications processes

The ATSB and TSB have processes in place to communicate safety deficiencies immediately to stakeholders. They also prioritize their safety communications according to risk: the deficiencies with the highest risk are addressed by recommendations, which are their highest level of safety communication.

In addressing safety issues, both agencies prefer to encourage relevant organizations to initiate safety action proactively rather than to issue formal safety recommendations or safety advisory notices. However, recommendations or safety advisories are issued when safety action has not been taken, or when the action taken has not mitigated the risk to the extent the organization believes appropriate.

Both the ATSB and TSB publish recommendations on their respective websites. However, the ATSB also publishes all safety issues identified in ATSB reports in a searchable format. The ATSB tracks and assesses safety action taken to address all significant and critical safety issues, as well as action in response to safety recommendations. In contrast, the TSB formally tracks responses to recommendations, safety advisory letters, and safety information letters, but assesses responses to recommendations only.

2.8 Summary: Comparison of ATSB and TSB methodologies

The ATSB and TSB methodologies are both robust investigative systems focused on identifying and analyzing the sequence of events. They have many similarities because of their common philosophical underpinnings. Both organizations work toward ensuring their methodologies are integrated into day‑to‑day investigative work, which can be challenging.

It is the differences between the two approaches that provide opportunities to identify best practices, however. The following three differences are discussed in more detail later in this report:

  • There are differences in the quality-control processes employed by the two organizations.
  • The ATSB makes effective use of evidence tables that provide an excellent basis for substantiating or refuting hypotheses/findings and for writing the analysis portion of an investigation report. The TSB does not systematically use evidence tables.
  • In the ATSB's risk assessment framework, the criteria used to determine the severity of consequence are more heavily influenced by the aircraft size and type of operation than they are in the approach taken by the TSB.

2.9 Comparison of TSB and ATSB methodologies with ICAO Annex 13

Australia and Canada are both signatories to the International Civil Aviation Organization (ICAO) Convention on International Civil Aviation (Chicago, 1944). The Standards and Recommended Practices for Aircraft Accident Inquiries were first adopted by ICAO on 11 April 1951 pursuant to Article 37 of the Convention, and were designated as Annex 13 to the Convention.

Contracting States are bound to apply all standards outlined in Annex 13, unless a difference has been filed. Australia has filed several differences; most are administrative, and primarily concern the protection and sharing of information in accordance with Australian law. Canada has not filed any differences against Annex 13.

The last ICAO safety oversight audit of Australia's civil aviation system in February 2008 made two findings concerning air accident and incident investigations: the selection of occurrences to be investigated, and the medical examination and toxicological testing of surviving flight crew, passengers, and involved aviation personnel after an accident. Neither of these findings was considered relevant to the TSB Review.

The TSB Review found that the methodologies and processes used by Australia and Canada generally meet or exceed the intent and spirit of the Standards and Recommended Practices in Annex 13.

3.0 Review of the Norfolk Island investigation

3.1 Factual information related to the Norfolk Island investigation

On the night of 18 November 2009, an Israel Aircraft Industries Westwind 1124A operated by Pel-Air Aviation Pty Ltd (Pel-Air) was on an aeromedical flight from Apia, Samoa, with a planned refuelling stop at Norfolk Island. The aircraft, unable to land at the aerodrome due to weather and with its fuel about to be exhausted, was ditched in the ocean approximately 3 nm west of Norfolk Island. On board were the captain, first officer, a doctor, a flight nurse, the patient, and one passenger. All were able to evacuate, and were rescued from the water.

The terms of reference and the scope of the TSB Review excluded re-investigating the Norfolk Island ditching; rather, the Review was to focus on how the investigation was conducted.

This section provides an overview of the investigation from the initial notification of the occurrence to the publication of the final report. The information is structured around the main investigation phases set out in the ATSB's Safety Investigation Quality System (SIQS): notification and assessment, data collection (investigation), analysis, and reporting.

3.2 Notification and assessment

The ATSB was notified of the occurrence on the day it happened, 18 November 2009, and the decision to investigate it was made on the following day. A team of four was initially assigned to the investigation. The investigator in charge (IIC) had an aircraft operations background; the other three team members were specialists in engineering, aircraft maintenance, and human factors.

The team did not immediately deploy to the site because the aircraft wreckage was inaccessible and the passengers and crew were returning to Australia. Investigation activities within the first several days included

  • issuing protection orders to secure the wreckage and documents;
  • interviewing the captain;
  • liaising with the Australian Civil Aviation Safety Authority (CASA), the Norfolk Island Police, the Australian Bureau of Meteorology (BoM) for weather information, and Norfolk Island Unicom and Airservices Australia for audio recordings;
  • confirming accredited representatives for New Zealand, Fiji, and Israel; and
  • assessing the need and capability for recovery of the cockpit voice recorder (CVR) and flight data recorder (FDR).

3.3 Data collection (November 2009 to September 2010)

3.3.1 Wreckage examination and recovery

On 29 November 2009, the team deployed to Norfolk Island to search for the wreckage, with the assistance of the Victoria Water Police. The wreckage was located, and the team returned to Australia on 02 December. The ATSB and Victoria Water Police did a video survey of the wreckage from 20 to 22 December.

Considerable research was conducted into the options and associated costs for recovering the flight recorders from the wreckage. By policy, military support was not available unless commercial options did not exist. It was determined that a portable decompression chamber would be required on site for any diving operation because of the depth of the water, which would have driven the cost of recovering the recorders to more than AUD$200 000. On 25 January 2010, the ATSB Chief Commissioner decided that this would not be an efficient or effective use of ATSB resources, given what was known about the circumstances of the ditching and the availability of other sources of data and information.

3.3.2 Witness interviews

The first interviews with the captain and first officer were conducted on 23 November and 02 December respectively. Additional ATSB interviews, telephone conversations, and email exchanges occurred with both the captain and first officer from December 2009 to August 2010. Some telephone contacts included briefings on the progress and direction of the investigation. On 19 January 2010, the pilots were interviewed during a "static reconstruction" of the event in another of the company's Westwind aircraft, with flight information and air traffic control (ATC) recordings from the Norfolk Island occurrence flight available to the crew.

In early December 2009, the Westwind fleet manager was interviewed, and another interview was conducted with the captain. The ATSB did not interview other company personnel during the investigation. The IIC briefed the company management on the issue of crews requesting weather observation updates en route but not requesting forecasts.

Interviews with the other aircraft occupants (the patient and the patient's spouse, the flight doctor and the flight nurse) were held in early December 2009. The IIC and the human factors specialist conducted most of the interviews.

The initial interviews with the flight crew explored the pilots' work schedules as well as sleep achieved during the layover in Apia. However, a comprehensive sleep–wake history going back at least 72 hours and to the last two adequate periods of restorative sleep was not immediately obtained.

3.3.3 Air traffic control recordings and weather information

Throughout December 2009, the ATSB requested and received audio recordings from the Unicom at Norfolk Island and from air navigation service providers in Australia, New Zealand, and Fiji.

From November 2009 to August 2010, as the investigation progressed, the ATSB requested and received several reports of meteorological information from the Australian Bureau of Meteorology. This included detailed wind and temperature information from FL340 to FL390 covering the entire flight path and time, provided on 14 December 2009, and low-altitude wind and temperature information for Norfolk Island from the surface to 800 hPa (approximately 6000 feet above sea level), provided on 31 March 2010.

3.3.4 Information on weather-related decision making

3.3.4.1 Survey of other operators

In January 2010, the IIC asked five operators using small jets on flights over water for the procedures they provided to crews to help them make weather-related decisions on long flights. A review of these operators' standard operating procedures (SOPs) found they did not include specific guidance related to the types of en route decision making that had been needed in the Norfolk Island occurrence. Instead, the operators relied on flight crews' experience and judgement.

3.3.4.2 Survey of students studying for the airline transport pilot licence

In January 2010, the IIC worked with instructors at an airline transport pilot licence (ATPL) training college to conduct an informal survey of ATPL candidates. The survey was intended to gauge the candidates’ understanding of what decisions were required on receipt of an en route update indicating that the weather for the arrival time at destination was forecast to be below alternate minima, but above landing minima. All of the candidates believed a diversion was warranted under these circumstances, but they were less certain about whether a diversion was legally required.

3.3.4.3 Survey of other crews

In September 2010, the IIC conducted an informal survey of pilots concerning weather and decision making on flights where no alternate aerodrome is required and no alternate fuel is carried. The sample for the survey was small and did not include Pel-Air pilots other than the occurrence crew. Participants were given a hypothetical situation involving a decision whether or not to continue to destination. They were asked what information they would gather and when, and were posed questions related to the weather criteria for a diversion. The IIC concluded that pilots did not use a consistent approach to gathering weather information and making decisions in these circumstances.

3.3.5 Release of the preliminary investigation report

On 13 January 2010, the ATSB released a seven-page preliminary investigation report.Footnote 11 The report provided extensive factual information, a discussion of the progress and direction of the investigation, and a statement that the feasibility of recovering the CVR and FDR was being assessed.

3.3.6 Information on regulatory oversight

Throughout the investigation, ATSB staff and management consulted or briefed CASA staff and management. Attachment A of the Memorandum of Understanding between the ATSB and CASA (October 2004) indicated that, upon agreement by both CASA and ATSB, a CASA officer might participate in the ATSB investigation. In this instance, no CASA officer was designated.

There was regular communication over the course of the investigation between the ATSB and CASA to share information and encourage safety action. Most of the briefings, which began in November 2009, were given by the ATSB to CASA on the scope and findings of the investigation. The briefings on the safety issue concerning guidance to pilots on en-route weather-related decision making took place in February 2010.

CASA had conducted a special audit of Pel-Air from 26 November to 16 December 2009, after the ditching. The IIC was concerned that reviewing the special audit report might bias the ATSB investigation, and so did not request a copy. The ATSB received a copy of the CASA special audit report in July 2012, during the DIP process.

On 28 July 2010, CASA briefed the ATSB on the findings of its regulatory investigation into the ditching, which it had done in parallel with the ATSB investigation.Footnote 12 The team leader obtained a copy of the CASA investigation report in March 2011.

An internal CASA audit report dated 01 August 2010Footnote 13 critically analyzed CASA's oversight of Pel-Air and its ability to oversee the wider industry. The ATSB had not known about this report during the investigation, and so it was not taken into account during decisions as to the scope of the investigation.

3.3.7 Investigation management in data collection phase

On 14 December 2009, the IIC provided a briefing to ATSB management that included an update on the progress and scope of the investigation, as well as an assessment of possible risks to the investigation. One issue raised at the briefing was the potential impact of the regulator conducting a parallel investigation, which was described in the briefing materials as possible role confusion and pressure to modify the scope of the ATSB investigation. This meeting was likely the source of miscommunication about how this situation would be handled. The IIC understood that the investigation should not cover the same areas as CASA, while ATSB managers believed it was clear that the ATSB investigation was fully independent. This misunderstanding persisted throughout the investigation, and as a result, only two ATSB interviews were conducted with managers and pilots at Pel-Air.

3.4 Analysis phase (February 2010 to October 2010)

By February of 2010, most of the data collection for the investigation was complete. At this point, the IIC was satisfied that a safety issue had been identified, namely the insufficient guidance given to flight crews on obtaining timely weather forecasts en route to help them make decisions when weather conditions at destination were deteriorating. The IIC drafted an analysis of the issue, including a risk assessment that categorized the issue as "critical." Because the evidence tables and risk analysis tools in the Safety Investigation Information Management System (SIIMS)Footnote 14 do not provide a version history, the TSB Review could not assess the use made of them during this period. On 01 February 2010, the team leader and the general manager (GM) decided to provide CASA with a briefing on the perceived safety issue.

The briefing was held by video conference on 03 February 2010. On 12 February, the primary contact at CASA followed up with a phone call to the IIC asking the ATSB to send a letter describing the safety issue.

Around this time, several meetings and discussions on the signing of a new memorandum of understanding (MOU) between CASA and the ATSB were documented. The MOU, which was signed on 09 February 2010, reflected the ATSB approach favouring proactive safety action over recommendations. By taking this approach, the ATSB aimed to track action taken by stakeholders on all issues determined to have a systemic impact (known as safety issues) and to issue recommendations only as a last resort when an unacceptable risk persisted.

CASA was reported to be supportive of this approach and to be anticipating that the Norfolk Island investigation would generate safety action. At the same time, CASA was clear that it intended to pursue regulatory action against the pilot. The two agencies' differing perspectives and approaches were the subject of several email exchanges between the IIC and the Chief Commissioner.

Also in February 2010, an industry stakeholder contacted the GM suggesting that the possibility of fatigue should be considered in the investigation. This correspondence was forwarded to the IIC for consideration and review. The IIC in turn communicated with the human factors investigator about the possible impact of fatigue on the ability of a pilot to assimilate weather information during the flight. No fatigue analysis was prepared at this time.

On 26 February 2010, the ATSB sent a letter to CASA outlining the "critical" safety issue. Based on the preliminary analysis, the issue was described as a lack of regulation or guidance for pilots on long flights to destinations with no nearby alternate aerodromes who encounter meteorological conditions that had not been previously forecast.

CASA sent a written response to the ATSB on 26 March 2010, agreeing that the then-current regulations were not prescriptive, but pointing out that weather-related decision making was part of all pilot training syllabi, beginning with the day-VFR (visual flight rules) syllabus. CASA expressed the view that the current published guidance material should allow pilots to make appropriate in-flight decisions, but said it was reviewing the regulations to determine whether amendments were needed.

By the beginning of May, an initial draft of the factual section of the investigation report had been compiled. On 18 May 2010, a critical investigation review was held and an analysis coach was appointed to help prepare the analysis section of the report. The ATSB's analysis framework, as well as its information management tools for recording the analysis, had recently been updated; the addition of a coach to teams had been successful in previous investigations in facilitating the use of the tools.Footnote 15

The analysis coach and the IIC worked together for a period of time, much of which was devoted to refining fuel calculations for the occurrence flight. Both the coach and the IIC found the process frustrating. The IIC felt that the coach's focus on the performance of the flight crew prevented the coach from seeing the systemic issues that the IIC considered important. The coach felt that insufficient data had been collected to identify systemic issues.

A progress meeting involving the IIC, the analysis coach, the team leader, and the GM took place on 27 July 2010. On 06 August, the coach, citing an inability to make progress on the analysis due to a lack of supporting data, asked to be, and ultimately was, removed from the investigation.

In the summer of 2010, there was a change in team leader responsible for the investigation: the original team leader was appointed to the position of assistant GM at the beginning of July, but remained involved in the Norfolk Island investigation until the handover to the new team leader was complete in mid-August. It was in the context of this handover that the inability to make progress with the analysis coaching arose. No action was taken to provide further assistance with the analysis.

The IIC continued to develop the analysis and draft report independently. A team meeting involving three of the original team members (the IIC, the human factors specialist, and the engineering specialist) was held on 09 August 2010, during which they reviewed the report findings and agreed on several amendments.

3.5 Report preparation (November 2010 to March 2012)

The IIC submitted the initial draft report to the team leader on 08 November 2010, when the team leader was deployed to a major investigation outside Australia that would continue to draw heavily on ATSB resources into late 2011.

On 12 November, the team leader assigned an investigator to conduct a peer review of the report. The review was completed on 03 December, using the SIIMS format for providing peer reviewer comments, and identified concerns with the factual information presented, the safety factors analysis, the findings, and the readability of the report.

Around this time, the team leader assigned another investigator to undertake a second peer review of the report. The IIC was unaware that a second peer review had been requested. He responded to the original reviewer's comments, made changes to the report, and sent these revisions to the team leader on 10 December. On the same day, the second peer reviewer provided comments, in the form of an edited MS Word document, directly to the team leader.

There was no further action on the report until 17 February 2011, when the IIC sent a follow-up e-mail to the team leader. At this point, the IIC was still unaware that a second peer review had been undertaken and was waiting for the team leader's response to the modified report submitted in response to the first peer review.

On 18 February, the team leader forwarded the MS Word document containing the second peer reviewer's comments to the IIC and asked that they be considered.

The IIC made further revisions to the report in response. On 15 March, the IIC and the human factors investigator (the only remaining members of the investigation team) agreed that the report was ready for the team leader's review, and they submitted it the same day.

In March, April, and May of 2011, the IIC received and responded to multiple requests for updates from stakeholders, including the flight crew, the passengers, and the Bureau of Meteorology.

On 02 May, the IIC asked the team leader how to expedite the investigation. On 04 June, the team leader completed the review and returned the report to the IIC for response. The IIC made the requested modifications to the report and on 05 July, after another review by the team leader, the report was sent to the GM for review.

At this time, the IIC prepared and sent to CASA briefing sheets outlining two safety issues raised in the draft report: 1) fuel-management practices for long flights, and 2) Pel-Air crew training and oversight of flight planning for abnormal operations.

In preparation for a follow-up meeting with CASA, the draft report and supporting analysis were reviewed by an acting team leader who raised concerns to the GM about the adequacy of the data and analysis used to support the draft safety issues.

In response, the GM directed a third peer review by two operations (pilot) investigators who had not previously been involved in the investigation. They completed it on 11 August 2011, and provided six pages of comments, suggesting that the organizational issues identified in CASA's investigation report were significant and needed to be developed further in the ATSB report. The IIC reviewed the comments and provided a response to the GM on 05 September 2011.

During this period, the GM was working through a backlog of reports, many of which required considerable revision and editing. The GM was making these revisions himself, and he did the same for the draft Norfolk Island report. It was in the course of this work that the GM concluded that the available data did not substantiate classifying the safety issue on guidance for en-route weather-related decision making as significant, and modified the report to recast it as a minor safety issue. The report, with revisions and comments from the GM, was returned to the IIC, who reworked sections of the report and returned it for another review by the GM.

Internal ATSB communications during 2011 and early 2012 indicated a significant level of frustration with the lack of progress on the report and the extent of revision it required, given the number of reviews it had already undergone.

3.6 Report release (March to August 2012)

3.6.1 First DIP process

On 26 March 2012, the draft report was sent on the authority of the GM to directly involved parties (DIPs)Footnote 16 for comment. Consistent with ATSB policy, the Commission did not review the report before it was released to DIPs.

After several requests for extensions, submissions were received from all DIPs by 16 May 2012, and the DIP response process began. The IIC responded to the DIP submissions using the Web-based forms in SIIMS, and made modifications to the report in response to the comments. The IIC's responses were reviewed by the team leader and the GM in turn, with additional modifications at each stage. On 25 June 2012, the Commission began reviewing the draft investigation report.

As the responses to formal DIP submissions were being drafted, the ATSB received several enquiries from stakeholders about issues that were not in the report. These included the regulatory framework that permitted the occurrence flight to be conducted without an instrument flight rules (IFR) alternate aerodrome; the possibility of flight-crew fatigue; and the issues identified in CASA's special audit of Pel-Air.

On 24 May 2012, in response to an enquiry from the IIC, CASA provided an update on its plans to pursue regulatory changes that would no longer permit international aeromedical flights to be conducted under the category of aerial work. Under the planned changes, such flights would be required to identify an alternate aerodrome in the flight plan and to carry alternate fuel.

Due to the significant revisions to the report, including a finding related to Pel‑Air's oversight of its operations that was added after the ATSB received the CASA special audit report, a second DIP review was started to give affected stakeholders (the captain, CASA, and Pel-Air) an opportunity to make additional comments.

3.6.2 Second DIP process

The revised draft report was released to DIPs on 16 July 2012, and submissions were received by 23 July. The IIC prepared the responses, which the team leader and the GM then reviewed. When, on 30 July 2012, the GM sent the revised draft report to the Commission for review, he remarked that the report was likely to result in significant media interest, and recommended that a media strategy be adopted.

In reviewing the report, the Commissioners expressed concern that there was insufficient factual information and analysis in the report to support the revised finding on the company's oversight of its operations. The Commission also wondered why the CASA special audit of Pel-Air had not been relied upon more extensively to support the findings.

These comments by the Commission did not result in changes to the report.

Ultimately, the finding related to Pel-Air's oversight of its operations was removed from the report, and on 17 August 2012, the report was approved for public release.

3.6.3 Public release and correction of factual errors

The report was released to the public on 30 August 2012. The following day, the ATSB was advised that the report contained a factual error with respect to the 0630Z METAR weather observation the pilot received. A correction was made to the report on the ATSB website, but the change was also incorrect. When it was advised of this second error, the ATSB corrected the METAR information again, and provided extra details to ensure readers understood that incorrect information had been transmitted to the crew with respect to the 0630Z METAR.

3.7 Analysis related to Norfolk Island investigation

3.7.1 Adequacy of Norfolk Island report

The function of the ATSB is to "improve safety and public confidence in the aviation, marine and rail modes of transport",Footnote 17 in part through the independent investigation of occurrences. The response to the Norfolk Island investigation report clearly shows that the investigation report published by the ATSB did not adequately address issues that the Australian aviation industry and members of the public expected to have been addressed, including

  • the regulatory framework that allowed this flight to be conducted as aerial work without an IFR alternate aerodrome;
  • the extent to which the flight planning and fuel management deficiencies observed in this occurrence extended throughout the company;
  • the adequacy of CASA oversight of Pel-Air;
  • the potential for flight crew fatigue and the adequacy of fatigue-management measures in place at Pel-Air;
  • issues surrounding egress and the survivability of crew and passengers.

This section of the analysis will focus on how the investigation report came to be wanting in these areas. Each element of the terms of reference for the TSB Review will be addressed: application of the investigation methodology; management and governance of the investigation; investigation report processes; and external communication.

3.7.2 Application of the investigation methodology

3.7.2.1 Data collection issues

The timely collection of occurrence data is critical to a thorough analysis and an efficient investigation. Although changes in scope or new avenues of inquiry frequently make it necessary to collect additional data as an investigation progresses, the perishable nature of some data or time constraints on the investigation may make this impossible.

In the Norfolk Island investigation, the analysis of specific safety issues, including fatigue, fuel management, and company and regulatory oversight, was hampered because insufficient data had been collected, as indicated in the following sections.

3.7.2.1.1 Fatigue

Analysis of human fatigue in transportation occurrences requires sufficient data to demonstrate whether a state of sleep‑related fatigue existed, and whether it played a role in the human behaviour observed in the occurrence. The former requires a comprehensive history of the quantity and quality of sleep going back at least 72 hours, preferably to the last two periods of restorative sleep. The latter requires the human performance observed to be compared with the known effects of fatigue on performance. An understanding of the level of risk associated with fatigue in an operation also requires information about the countermeasures in place.
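To illustrate the two tests described above, the following is a minimal sketch in Python of how a 72-hour sleep–wake history might be screened for the existence of fatigue. The sleep periods, function names, and thresholds (17 hours of continuous wakefulness; fewer than 6 hours of sleep in the prior 24 hours) are assumptions for illustration, drawn from general fatigue science rather than from any ATSB or TSB criterion.

    from datetime import datetime, timedelta

    # Hypothetical sleep-wake history: (sleep_start, sleep_end) pairs covering
    # at least the 72 hours before the occurrence, as recommended above.
    sleep_periods = [
        (datetime(2009, 11, 15, 22, 0), datetime(2009, 11, 16, 6, 0)),
        (datetime(2009, 11, 16, 23, 0), datetime(2009, 11, 17, 5, 0)),
        (datetime(2009, 11, 17, 23, 30), datetime(2009, 11, 18, 4, 30)),
    ]
    occurrence_time = datetime(2009, 11, 18, 22, 0)

    def hours_awake(periods, now):
        """Hours of continuous wakefulness since the end of the last sleep."""
        last_sleep_end = max(end for _, end in periods if end <= now)
        return (now - last_sleep_end).total_seconds() / 3600

    def sleep_in_window(periods, now, window_hours):
        """Total hours of sleep obtained in the window before 'now'."""
        window_start = now - timedelta(hours=window_hours)
        total = timedelta()
        for start, end in periods:
            overlap = min(end, now) - max(start, window_start)
            if overlap > timedelta():
                total += overlap
        return total.total_seconds() / 3600

    # Existence test with illustrative thresholds: prolonged wakefulness or
    # acute sleep restriction makes a state of fatigue plausible. The second
    # test (influence) still requires comparing the behaviour observed in the
    # occurrence with the known effects of fatigue on performance.
    awake = hours_awake(sleep_periods, occurrence_time)
    slept = sleep_in_window(sleep_periods, occurrence_time, 24)
    if awake >= 17 or slept < 6:
        print(f"Fatigue plausible: {awake:.1f} h awake, {slept:.1f} h sleep in prior 24 h")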

In the Norfolk Island investigation, some sleep and rest data were obtained in initial crew interviews, but detailed information was not obtained for the period before the occurrence crew was paired. In analyzing the potential for sleep-related human fatigue, it is the quantity and quality of sleep obtained that is of critical interest, and it is problematic to assume that an individual who is not working is well rested. Further, efforts to conduct a fatigue analysis were hampered by the fact that, during different interviews, the flight crew gave different accounts of the amount of sleep obtained on the layover.

The ATSB Safety Investigation Quality System (SIQS) included a checklist intended to provide guidance for investigators when interviewing individuals who will be DIPs. The use of the checklist was not mandatory, and the introduction to the checklist reminded investigators that a checklist might not cover all the information that should be collected in such circumstances. The checklist for obtaining sufficient history for a fatigue analysis included the following points:

Fatigue/alertness

  • normal sleep pattern – any changes in last week
  • normal sleep quality (difficulties falling, staying asleep)
  • last sleep – start, finish, quality, any disturbances
  • 2nd last sleep – start, finish, quality, any disturbances
  • naps in last 48 hours
  • factors affecting sleep quality – medical, personal, environmental

Recent activities

  • describe work schedule last 3 days, anything unusual
  • describe how feeling at time
  • describe workload at time (physical, mental, time demands)
  • describe non-work activities on day; night before
  • recent exercise, exertions
  • what meals/food on day – when
  • what drank on day, when (coffee, tea, water)Footnote 18

It is not known whether the checklists were used in the crew interviews.

An analysis of fatigue data was not attempted until late in the investigation. By then, it was too late to address shortcomings in the available data. The report said that the flight crew were likely to have been fatigued on their arrival in Apia on the outbound flight due to prolonged wakefulness, but concluded that there were insufficient data to determine whether the crew were fatigued at the time of the occurrence or whether fatigue played a role in the crew's understanding of weather information received en route. The adequacy of the operator's fatigue-management measures was not addressed in the report.

3.7.2.1.2 Fuel management

The investigation examined fuel planning and in-flight decision making. Although the flight was not required to carry fuel for an alternate aerodrome, it was required to carry fuel for contingency operations. On page 39, the report states, "the flight crew departed Apia with less fuel than required to safely complete the flight in case of one engine inoperative or depressurised operations from the least favourable position during the flight."Footnote 19 However, the report does not specify how much fuel would have been required for these contingencies.

The meteorological information collected would have supported calculation of expected aircraft cruise and arrival fuel performance for comparison with the occurrence flight. However, there was little information on low- and mid-level winds, preventing investigative calculations of expected fuel performance for one‑engine-inoperative or depressurized operation. Being able to calculate the occurrence flight fuel requirements for contingency operations would have helped investigators better understand the occurrence crew's in-flight decision making.
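The reasoning in the preceding paragraphs can be made concrete with a minimal sketch of a contingency-fuel check. All burn rates, distances, reserves, and fuel loads below are hypothetical placeholders, not Westwind 1124A performance data; the sketch shows only the structure of the calculation, in which the fuel on board at each point of the flight is compared with the fuel needed to complete the worst credible contingency from that point.

    # Hypothetical per-nautical-mile burn rates for the two contingencies the
    # report identifies; real values would come from aircraft performance data.
    OEI_BURN_KG_PER_NM = 1.9        # one engine inoperative: slower, draggier
    DEPRESS_BURN_KG_PER_NM = 2.6    # depressurised: forced down to low altitude

    def contingency_fuel_required(dist_to_aerodrome_nm, reserve_kg):
        """Fuel needed from a given point under the worse of the two contingencies."""
        worst_rate = max(OEI_BURN_KG_PER_NM, DEPRESS_BURN_KG_PER_NM)
        return worst_rate * dist_to_aerodrome_nm + reserve_kg

    # Sample points along the route: (distance remaining to the nearest
    # suitable aerodrome in nm, fuel actually on board in kg).
    route_points = [(900, 2700), (600, 1700), (300, 1100)]
    for dist, fuel_on_board in route_points:
        required = contingency_fuel_required(dist, reserve_kg=300)
        status = "OK" if fuel_on_board >= required else "SHORTFALL"
        print(f"{dist} nm out: need {required:.0f} kg, have {fuel_on_board} kg -> {status}")

Performing this comparison for the occurrence flight would have required the low- and mid-level wind data that, as noted above, were not collected.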

The IIC's perspective was that he should not duplicate work being done by CASA, including interviews with other Pel-Air pilots. Consequently, the interviews did not take place, and the investigation could draw no conclusions as to whether the fuel planning and management practices on the occurrence flight were used by other Pel-Air pilots.

Three strategies had been employed to gather data to determine whether the flight planning and fuel-management deficiencies observed in the occurrence were widespread. The first was a survey of a sample of operators flying light jets over water, who were asked to provide copies of the procedures given to crews to help them make weather-related decisions on long flights. The second was a consultation with a pilot training college, which provided information on ATPL candidates' understanding of the need to divert if weather was forecast to be below alternate minima at the destination aerodrome. The third consisted of responses from a sample of pilots who were given a hypothetical flight situation and asked what weather information they would obtain, when they would ask for it, and what criteria they would use for diverting to an alternate aerodrome.

The survey of operators, while not helpful in determining how widespread the practices observed in the occurrence were within Pel-Air, did provide a baseline against which to compare Pel-Air's procedures in this area, and was relied upon extensively in communicating the safety issue to CASA in February 2010.

It is not clear whether the survey of ATPL candidates was derived from a deficiency identified in the investigation of the Norfolk Island occurrence. Nonetheless, because of the small size of the sample and the hypothetical nature of the questions posed in the survey, the data obtained did not effectively support the argument that a systemic deficiency existed.

3.7.2.1.3 Company and regulatory oversight

In this investigation, very little of the data collected documented the actions taken by the regulator to oversee Pel-Air's operations or the actions taken by Pel-Air itself. Data collection at Pel-Air consisted of interviews with the occurrence crew and the Westwind fleet manager and a review of documents Pel-Air had provided. Investigators had not interviewed additional Pel-Air crews to determine the extent to which the flight planning and fuel monitoring deficiencies observed in the occurrence existed throughout the company, and only one management interview had been conducted over the course of the investigation.

Similarly, no interviews were held with CASA operations inspectors who were familiar with the operation and oversight of Pel-Air, and several key documents, including the CASA special audit of Pel-Air, were not obtained until very late in the investigation.

The lack of data in these areas was felt throughout the investigation. Two examples of this are the removal of a finding with respect to Pel-Air's oversight of its aeromedical operations,Footnote 20 and the lack of any analysis of CASA's oversight of Pel-Air.

The reasons for inadequate data collection in these areas will be discussed below in the section on governance of the investigation.

3.7.2.2 Application of analysis tools

The ATSB Safety Investigation Guidelines Manual notes that analysis provides the link between collected data and investigation findings. It identifies four keys to an effective analysis, namely the use of well-defined concepts; a structured approach or process; a team-based approach; and knowledge of the domain being investigated.Footnote 21

The ATSB's Safety Investigation Quality System (SIQS) laid out well-defined and accepted concepts as well as a structured approach for investigations. It included events diagrams, all-factor maps based on the elements of the James Reason model of accident causation, and evidence tables that provided tests for the existence and influence of potential safety factors. As noted elsewhere in this report, the ATSB analysis tools, when used effectively, provided a structure for organizing the arguments to validate safety factors and a useful framework for encouraging a team approach to analysis.
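As a rough illustration of what an evidence table captures, the following minimal sketch models a single safety factor with the two tests the ATSB framework applies. The class and field names are invented for illustration and do not reflect the actual SIIMS schema.

    from dataclasses import dataclass, field

    @dataclass
    class SafetyFactor:
        description: str
        evidence_for: list = field(default_factory=list)
        evidence_against: list = field(default_factory=list)
        existence_shown: bool = False   # test 1: did the condition exist?
        influence_shown: bool = False   # test 2: did it influence the occurrence?

        def validated(self) -> bool:
            """A factor should enter the findings only if both tests are met."""
            return self.existence_shown and self.influence_shown

    factor = SafetyFactor(
        description="Insufficient guidance on obtaining en-route weather updates",
        evidence_for=["operator procedures", "crew interviews"],
    )
    factor.existence_shown = True    # supported by documentary evidence
    factor.influence_shown = False   # influence not yet demonstrated
    print(factor.validated())        # False: more data collection or analysis needed

Laying factors out in this form makes gaps visible: a factor that fails either test points the team either to additional data collection or to a weaker formulation of the finding.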

The guidelines on analysis provided to ATSB investigators emphasized that analysis was an iterative process that might trigger additional data collection:

Analysis is an iterative process, interacting with the other major tasks involved in an investigation. Analysis may lead to a requirement to collect further data, which then needs to be analysed. During report preparation, we may identify a need to conduct further analysis, which may change the content of the final report.Footnote 22

Shortly before the Norfolk Island occurrence, the ATSB had integrated its suite of analysis tools into its information management system. This was part of an effort to improve the consistency with which the tools were used in investigations, and to address the considerable variability in the way the investigators and managers had been observed to use them.

There were a number of challenges to overcome to achieve this, including

  • a low level of comfort with the investigation tools among some investigators who were required to use them infrequently;
  • the perception by some investigators that the analysis tools were an unnecessary administrative burden; and
  • the tendency of some investigators to conduct analysis on their own rather than employ a team to test their thinking.

Some of these challenges were observed during the review of the Norfolk Island investigation.

  • The "critical" safety issue that the IIC had identified (the insufficient guidance given to flight crews on obtaining timely weather forecasts en route to help them make decisions when weather conditions at destination were deteriorating) was communicated to CASA in February 2010, but
    • the issues with data collection described above meant that the information available to support the identification of a critical safety issue was weak;
    • the description of the safety issue had been prepared by the IIC in isolation and recorded in MS Word documents and e-mails rather than in SIIMS;
    • the underlying analysis and supporting information had not been reviewed before the safety issue was communicated to CASA;
    • the description of the safety issue was reviewed as text, not against the ATSB's analysis tools, which meant that weaknesses in the material supporting the safety issue's definition as critical were not detected; and
    • an opportunity to emphasize the importance of using the tools early in the investigation was missed.
  • The analysis coaching provided during the Norfolk Island investigation was intended to help ease the transition to, and improve the use of, the analysis tools. Attempts were made to create a team approach, and to structure the analysis outward from the occurrence events. However, the team became bogged down by differences in perspectives on the validity of the data available to support the safety issues that had been identified previously. Consequently, the coaching broke down and the data quality issues were not resolved.
  • Multiple peer reviews of the report identified weaknesses in the data supporting the safety issue described in it and questioned the validity of the analysis. These concerns were shared by the GM. The GM worked with the IIC to refine the draft safety issues, but the approach was to revise the draft report rather than to perfect the analysis using the analysis tools and then modify the draft report based on the agreed-upon analysis.

The above examples indicate that the analysis tools were not effectively used in this investigation to

  • systematically work outward from the occurrence events to underlying organizational factors;
  • identify areas in which additional data collection was necessary;
  • structure a team-based approach to analysis; or
  • review safety issues or the draft report.

Weaknesses in the application of the ATSB analysis framework resulted in data insufficiencies not being addressed and potential systemic oversight issues not being analyzed.

3.7.2.2.1 Lack of a tool for conducting fatigue analysis

The ATSB has a number of excellent tools to provide investigators with a structure for analyzing safety factors and issues, but it does not use a specific tool to guide data collection and analysis in the area of human fatigue.

The TSB has developed a guide for investigators on this topic. While the suggested approach to analyzing fatigue uses the same tests for existence and influence as in the ATSB's analysis frameworks (i.e., one test to demonstrate that the condition existed and a second to demonstrate that it played a role in the occurrence), the TSB guide adds value by providing specific science‑based criteria for use in making fatigue-related determinations.

The guide also supports data collection in the area of sleep-related fatigue by providing a tool for capturing a good sleep–wake history, and by providing background information so that investigators understand the nature of fatigue and why the data are important.

3.7.2.2.2 Risk analysis

The ATSB Safety Investigation Policy and Procedures Manual states that "a risk analysis will be conducted prior to initiating any ATSB safety recommendation or safety advisory notice."Footnote 23 The purpose of the risk assessment is to determine whether the safety issue carries a risk level that warrants corrective action by another organization.Footnote 24

Understanding, and being able to describe, the risk associated with a safety issue is a crucial step to being able to make a compelling argument for change. If an investigative body is to convince stakeholders that action is required, it must be able to demonstrate that the consequences of maintaining the status quo are sufficiently costly to warrant devoting resources to change it. A risk analysis provides the grounds for just this argument. As the Norfolk Island investigation shows, however, these arguments can be undermined if there is too much focus on the labels associated with a risk assessment.

A safety issue labelled "critical" is associated with an intolerable level of risk and is immediately communicated to a relevant organization. If the consequent safety action is insufficient to reduce the risk level to below critical, then a safety recommendation may be issued as soon as possible.Footnote 25 In contrast, "significant" and "minor" safety issues are associated with lower levels of risk, and other vehicles—such as recommendations made on release of the final report, or safety advisory notices—will be used to communicate them with less urgency.

The safety issue communicated to CASA in February of 2010 related to the insufficient guidance given to flight crews on obtaining timely weather forecasts en route to help them make decisions when weather conditions at destination were deteriorating. When the safety issue was presented to CASA, it was categorized as "critical", but in the final report it was described as "minor", which caused significant concern among stakeholders.

The risk label applied to a safety issue is an output of the ATSB's risk analysis process, which involves the following steps:

  • Describe the worst possible scenario.
  • Review existing risk controls.
  • Describe the worst credible scenario.
  • Determine the consequences associated with the worst credible scenario.
  • Determine the likelihood of the worst credible scenario.
  • Estimate the level of risk (using consequences and likelihood).

Using the worst credible scenario is important because it accounts for the risk controls that are already in place to defend the system. A risk assessment based on the unlikely failure of all the defences in place may result in an argument for change that is alarmist and unconvincing.
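To make the final estimation step concrete, the following minimal sketch maps a worst-credible consequence and a likelihood to a risk label through a rating matrix. The scales, labels, and matrix values are assumptions for illustration and do not reproduce the ATSB's actual rating scheme.

    # Illustrative rating scales, ordered from least to most severe/frequent.
    CONSEQUENCES = ["negligible", "minor", "major", "catastrophic"]
    LIKELIHOODS = ["rare", "unlikely", "possible", "likely"]

    # Rows are likelihoods, columns are consequences; cell values are the
    # risk labels discussed in the text (again, illustrative only).
    RISK_MATRIX = [
        ["minor", "minor",       "minor",       "significant"],  # rare
        ["minor", "minor",       "significant", "significant"],  # unlikely
        ["minor", "significant", "significant", "critical"],     # possible
        ["minor", "significant", "critical",    "critical"],     # likely
    ]

    def risk_level(worst_credible_consequence: str, likelihood: str) -> str:
        """Estimate the level of risk from the worst credible scenario."""
        row = LIKELIHOODS.index(likelihood)
        col = CONSEQUENCES.index(worst_credible_consequence)
        return RISK_MATRIX[row][col]

    # The framing of the scenario drives the label: a diversion scenario and
    # a loss-of-aircraft scenario for the same safety issue yield different
    # labels, which is exactly the sensitivity discussed below.
    print(risk_level("major", "unlikely"))         # significant
    print(risk_level("catastrophic", "possible"))  # critical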

The guidelines the ATSB provides its investigators acknowledge that making these assessments can be somewhat problematic, and suggest that differences in perspectives among team members be discussed and resolved by establishing a range of possible risk ratings.Footnote 26 This is another example where the analysis guidelines stress the value of a team approach to conducting analysis, since understanding different perspectives will result in a more compelling argument for change.

In the Norfolk Island investigation, the change in the risk level assigned to the safety issue of guidance provided to crew members resulted from the application of different "worst credible" scenarios. A scenario resulting in the loss of an aircraft supports an argument for a catastrophic outcome, while a scenario involving an aircraft diverting supports an argument for a lower level of consequence. Which scenario is more plausible depends on how the issue is framed in the analysis and on how the adequacy of the risk-control measures in place is assessed. Determining which rating would be correct is beyond the scope of this analysis. However, the fact that the ATSB did not find it necessary to issue an immediate recommendation on this safety issue indicates that the initial classification of the issue as "critical" was excessive.

The important point here is that the use of the risk label did little to improve understanding of the safety issue being communicated in the ATSB report. The safety action taken by CASA and the operator was not a result of the label, but rather of the analysis of the safety issue itself. The level-of-risk label did not contribute to advancing safety; instead, it focused discussion on the label rather than on the identified issue and the potential means of its mitigation.

In the period since the Norfolk Island investigation, the ATSB has stopped publishing risk labels for safety issues.

3.7.3 Management and governance of the investigation

3.7.3.1 Misunderstanding of the roles of ATSB and CASA

Independence is critical to the work of an accident investigation body whose sole mandate is to improve safety. Parallel investigations by other agencies to fulfill their respective mandates should have no bearing on the actions of an independent safety investigation. In the Norfolk Island investigation, there was no actual barrier preventing any avenue of investigation, including examination of the regulatory process itself. However, there was a misunderstanding that affected the quantity and quality of the data available for analysis.

The Norfolk Island ditching occurred four months after structural changes made the ATSB fully independent: on 01 July 2009, the ATSB had ceased being a division within the Department of Infrastructure, Transport, Regional Development and Local Government, and became a separate statutory agency. These changes should have had the effect of reducing the likelihood of any influence by CASA on an ATSB investigation.

However, at a December 2009 progress briefing to ATSB management, one of the risks to the investigation discussed was the CASA parallel accident investigation. Afterward, perceptions of how this issue had been resolved differed. The IIC believed he had been instructed not to cover the same areas as CASA, since the regulator was conducting a parallel investigation. Meanwhile, the Commission and ATSB managers believed it was well understood that the investigations were fully independent and that there were no barriers to the ATSB investigation.

In addition, there were several communications between the IIC and the Chief Commissioner on the issue of regulatory action planned by CASA against the pilot. These communications did not clarify the independence of the ATSB investigation, and the IIC continued to believe that he had been instructed to avoid duplicating CASA's efforts.

The IIC's misunderstanding of the roles of CASA and the ATSB was never resolved. It resulted in information not being collected from Pel-Air to determine the extent to which the flight planning and monitoring deficiencies observed in the occurrence prevailed in the company in general.

3.7.3.2 Oversight of the investigation

Accident investigation bodies are entrusted with conducting comprehensive, impartial investigations into transportation occurrences in order to improve safety; staff complete the investigation work on behalf of the agency. There should be several layers of oversight and multiple processes to ensure that the output of the investigation is rigorous and defensible.

At the ATSB, the levels of oversight above the IIC are the team leader, the GM and the Commission. Quality-control processes, in addition to regular supervision by the team leader, included analysis coaching, critical investigation reviews, and peer reviews.

Although the processes in place provided multiple opportunities to address problems with the data quality and analysis, as described above, the lack of effective communication and weaknesses in the follow-up meant that certain issues remained unaddressed throughout the investigation:

  • Critical reviews were conducted periodically during the investigation but did not identify specific shortcomings in the data collection or analysis. The TSB Review team repeatedly heard that the critical review process was a valuable opportunity to discuss how to manage investigation risks, but that it was high level and unlikely to identify shortcomings related to data sufficiency or analytical rigour.
  • The GM and the team leader relied upon the expertise and judgment of the IIC when communicating the "critical" safety issue to CASA, and did not review the underlying analysis.
  • The breakdown in the analysis coaching did not prompt any action on the part of the team leader to explore the reasons for it, or to address weaknesses in the quality of the data or analysis.
  • Multiple peer reviews were commissioned over the course of the investigation. The IIC appears to have been unaware of the latter two peer reviews until they were complete, indicating ineffective communication between the IIC, the team leader and the GM. Although the IIC responded to each peer review, he was working in isolation, and it was left to him to accept or reject the peer reviewers' input. In the end, a number of critical points raised by the reviewers were not addressed and did not result in additional data collection or analysis.
  • The report passed the team leader's review, but the GM found it to be analytically weak. Rather than returning it to the team leader or IIC, the GM personally edited the report so that the analysis was supported by the available information.

Attempts to address problems with the quality of the analysis relied on peer-centred processes rather than on direct intervention by the team leader. In addition, few steps were taken to ensure that these attempts resulted in adequate analytical support for the report findings. This approach was evident at all levels, as indicated by the GM's decision to revise the report himself. Ultimately, ineffective oversight of the investigation resulted in issues with data collection and analysis not being identified or resolved in a timely way.

A number of factors underlying this indirect approach were identified to the TSB Review team.

  • There was a change of team leader, and some of the critical events, such as the breakdown of the analysis coaching, took place in the transition period.
  • During the Norfolk Island investigation, the ATSB was undertaking an atypical number of high-profile level-2 investigations that were consuming significant resources and attention. For example, the second team leader was deployed to a major investigation outside the country as the IIC was completing the initial draft of the Norfolk Island investigation.
  • There was a backlog of investigation reports, and the GM was trying to deal with it by editing reports himself while at the same time addressing the issues of analysis and team oversight.

3.8 Investigation report processes

3.8.1 Peer review process

The ATSB's Safety Investigation Guidelines Manual - Reporting describes the peer review process as crucial to the credibility of the ATSB, and states, "The reviewer shares responsibility with the IIC and the relevant Manager for the quality of any reviewed report and for aspects of the investigation's documented history upon which the report relies."Footnote 27 The guidelines go on to state that the peer reviewer should ensure that all steps in the investigation process have been taken; that the findings are relevant and supported; that no important issues have been overlooked; and that the report is fit for publication.

Checklists for the use of peer reviewers are provided in the ATSB's Safety Investigation Information Management System (SIIMS). There is also a Web-based form in which reviewers can record their comments and the IIC can record his or her responses to the peer reviewers' comments.

In the multiple peer reviews of the Norfolk Island investigation, there was some variability in the extent to which the checklists and Web-based forms were used and the underlying data and analysis were reviewed.

Although the IIC responded to each peer review, some of the issues raised in them were not addressed. It was normally the practice for team leaders to review the IIC's responses to peer review comments, but this practice was not effectively applied in the Norfolk Island investigation, due to workload.

Ultimately, the peer review process is only as effective as the response to the feedback and the oversight provided by managers. In this investigation, there was no second-level review, which resulted in most of the peer reviewers' input being left out of the report. This limitation in the ATSB's peer review process means that, barring an intervention from outside the normal process, there is a risk that improvements to the analysis and conclusions will not be incorporated into an investigation report.

3.8.2 DIP process

By inviting involved parties to make comments related to factual inaccuracies or omissions in the report, the DIP process provides an important opportunity to improve the accuracy of the report before it is released to the public. The TSB Review team noted a number of weaknesses with the current DIP process.

There were two DIP processes in the Norfolk Island investigation. Although each DIP submission was reviewed in depth, the IIC conducted these reviews in isolation. In both instances, the IIC used the Web-based forms in SIIMS to record the responses to the DIP comments and to indicate whether changes were made to the report.

The SIIMS forms often included multiple comments in a single entry labelled "partially accepted", making it difficult for any second-level reviewer to determine whether each issue had been addressed.
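One way to mitigate this difficulty, sketched below using invented field names rather than the actual SIIMS schema, is to record a disposition and rationale for each individual comment, so that a second-level reviewer can audit every item rather than a batch.

    from dataclasses import dataclass

    @dataclass
    class DipComment:
        submitter: str
        comment: str
        disposition: str       # "accepted", "partially accepted" or "rejected"
        rationale: str         # recorded per comment, not per batch
        report_changed: bool

    # Hypothetical entries from a DIP response log.
    responses = [
        DipComment("operator", "Fuel figures on page 39 are inconsistent",
                   "accepted", "Figures corrected against the flight plan", True),
        DipComment("operator", "Fatigue section overstates crew duty time",
                   "rejected", "Duty times verified against rosters", False),
    ]

    # A second-level reviewer can now isolate and audit each contested item.
    for r in responses:
        if r.disposition != "accepted":
            print(f"{r.submitter}: {r.comment!r} -> {r.disposition} ({r.rationale})")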

Unlike the TSB's process, in which the Board responds to each comment submitted by a designated reviewer, the ATSB does not provide a written response to DIPs who made submissions. Consequently, DIPs may be left with little understanding of the response to their submissions, which risks diminishing their acceptance of the report.

Ultimately, the lack of a process for the Commission to review the DIP responses, ensure the DIP comments were addressed, and provide the DIPs with feedback reduced the effectiveness of the DIP process in improving the quality of the Norfolk Island report.

3.8.3 Review by the Commission

The ATSB Commission includes the chief commissioner and two part-time commissioners. The chief commissioner is also the CEO, and is located in Canberra. Currently, the two part-time members live outside Canberra. Much of the communication among the three commissioners is electronic, with periodic face-to-face meetings. Only the Commission can approve the release of an investigation report, and the Commission is ultimately responsible for the quality of a report.

In the case of the Norfolk Island investigation, the Commission did not review the report until the first DIP process had been completed. This is consistent with ATSB procedures in that the Commission reviews only selected investigations before the DIP process. This is an area in which the ATSB differs from the TSB: at the TSB, the Board must approve all confidential draft reports for release to the reviewers.

When the Commission reviewed the report in June and July 2012, the commissioners expressed concern that there was insufficient factual information and analysis in the report to support a finding related to oversight of aeromedical operations by Pel-Air. The Commission was also concerned that the CASA special audit of Pel-Air was not relied upon more extensively.

The Commission's review of the report took place immediately after the first DIP process was completed, 31 months after the occurrence. At this stage in an investigation, it is difficult to address issues of insufficient factual information since perishable information will not be available and the collection of other information could incur substantial delays. It is for this reason that issues that had not been dealt with in the report, such as the possibility of crew fatigue, could not be addressed at this time.

The fact that the Commission does not formally review some reports until close to the end of the investigation, after the DIP process is complete, increases the risk that issues with the scope of the investigation and the quality of the report will be identified too late in the process to be resolved.

Communications among commissioners indicated that there was concern with the lack of analysis of the adequacy of company and regulatory oversight, especially in light of the CASA special audit report. However, this concern did not result in changes to the report.

Although the Commission can use the Web-based report-review form in SIIMS (the same form used at other levels of review) to record its comments for the IIC's response, it does not do so routinely. One result of this process gap was that the Commission's concern about company and regulatory oversight prompted neither follow-up investigative work by staff nor changes to the report.

The lack of a robustly documented feedback process after the Commission review increases the risk that issues with the scope of the investigation and quality of the report will not be resolved.

3.9 External communications

Clear communication with stakeholders and the public is critical to the ATSB's credibility and its ability to encourage safety action.

Throughout the Norfolk Island investigation, in accordance with the ATSB methodology, there was regular communication with stakeholders regarding factual information and potential safety issues.

Before the release of the final report, communications between the GM and the Commission had indicated the potential for significant public interest. However, although senior managers were aware of the possibility that the report would generate some controversy when it was released, communications staff were not consulted and no communications plan was developed to address a controversy if one were to arise.

After the release of the report, senior ATSB managers were aware of a broadcaster's intention to do a segment featuring the investigation, but were under the impression that it would focus on the accident itself and the safety issues raised in the report; they were unprepared for the events that followed.

Once the investigation became the subject of an external inquiry, the ATSB did not comment publicly on the report.

3.10 Findings from the TSB review of the Norfolk Island investigation

  1. The response to the Norfolk Island investigation report clearly demonstrated that the investigation report published by the ATSB did not address key issues in the way that the Australian aviation industry and members of the public expected.
  2. In the Norfolk Island investigation, the analysis of specific safety issues, including fatigue, fuel management, and company and regulatory oversight, was not effective because insufficient data were collected.
  3. The ATSB does not use a specific tool to guide data collection and analysis in the area of human fatigue.
  4. Weaknesses in the application of the ATSB analysis framework resulted in data insufficiencies not being addressed and potential systemic oversight issues not being analyzed.
  5. The use of level-of-risk labels when communicating safety issues did not contribute to advancing safety, and focused discussion on the label rather than on the identified issue and the potential means of its mitigation.
  6. A misunderstanding early in the investigation regarding the responsibilities of CASA and the ATSB was never resolved. As a result, the ATSB did not collect sufficient information from Pel-Air to determine the extent to which the flight planning and monitoring deficiencies observed in the occurrence existed in the company in general.
  7. Ineffective oversight of the investigation resulted in issues with data collection and analysis not being identified or resolved in a timely way.
  8. The lack of a second-level peer review in the Norfolk Island investigation meant that improvements to the analysis and conclusions stemming from the peer review were not incorporated into the report.
  9. At the ATSB, the Commission does not formally review some reports until after the DIP process is complete. This increases the risk that issues with the scope of the investigation and the quality of the report will be identified too late in the process to be resolved.
  10. The lack of a robustly documented feedback process after the Commission review increases the risk that issues with the scope of the investigation and the quality of the report will not be addressed.
  11. Ultimately, the lack of a process for the Commission to review the DIP responses, ensure the DIP comments were addressed, and provide DIPs feedback reduced the effectiveness of the DIP process in improving the quality of the Norfolk Island report.
  12. Although senior managers were aware of the possibility that the report would generate some controversy, communications staff were not consulted and no communications plan was developed.
  13. Once the investigation became the subject of an external inquiry, the ATSB could no longer comment publicly on the report, which hampered the Bureau's ability to defend its reputation.

4.0 Review of the Kangaroo Valley investigation

4.1 Factual information related to the Kangaroo Valley investigation

On 24 December 2011, an Agusta Westland AW139 helicopter was engaged in a winching operation to rescue a seriously injured rock climber, who had fallen and was trapped on a rock ledge at a waterfall in Budderoo National Park. During the winching operation, and while attached to the winch cable, the patient and the duty paramedic were pulled from the ledge and struck rocks at the bottom of the waterfall. The duty paramedic survived the initial impact but later succumbed to his injuries. The patient was subsequently evacuated overland and transported to hospital for treatment.

The following issues were examined in the course of the investigation:

  • Planning for retrieval operation
  • Execution of retrieval operation
  • Crew decision making related to pending nightfall
  • Communication between the helicopter and ground
  • Communication within the helicopter
  • Rope adaptations to stabilize hoist cable
  • Illumination systems design and usage
  • Assessment of possible tampering with evidence (ropes at site)
  • Organizational influences
  • Training
  • Site accessibility for investigators (ATSB risk management).

4.1.1 Notification and assessment

On 24 December 2011, the ATSB received initial notification of the accident, which indicated that minor injuries had been incurred. Early the following morning, updated information was provided that indicated the accident involved fatal injuries, and the decision was taken to investigate.

The occurrence was initially classified as a Level 4 investigation. A Level 4 investigation is less complex than those classified as Level 1, 2 or 3; it involves only one or two ATSB resources and/or requires up to three months to complete. In some cases, after the initial in-the-field activity, the investigation level may be changed, or the investigation may be discontinued if it is determined that there is no safety value to be gained from continuing it.

The investigation was reclassified to Level 3 on 08 March 2012. A Level 3 investigation is a less complex investigation that involves only one or two ATSB resources and/or requires up to nine months to complete, and may not always require in-the-field activity.

A team of three investigators was assembled. The IIC had a helicopter operations background and the two additional team members were specialists in materials engineering and aircraft maintenance. Four investigators with other specialties worked on portions of this investigation at various times.

A media release was issued to the Australian Associated Press on 25 December stating that the ATSB would be conducting an investigation. The ATSB had taken steps to notify stakeholders of the investigation and coordinate with the operator on the content of the media release before publishing it.

4.1.2 Data collection (December 2011 to August 2012)

4.1.2.1 Site and aircraft examination

The team did not immediately travel to the occurrence location. Given the difficulty of accessing the site, the team travelled to a location where the aircraft could be examined and initial interviews could be conducted.

Although there was no wreckage to be examined at the site, an understanding of the topography was necessary for the investigation. In addition, equipment and rope arrangements that needed to be examined remained at the accident site. Initial data collected in early January consisted of photos and videos taken by first responders and imagery from the Ambulance Service of New South Wales. The ATSB also took custody of the cockpit voice recorder (CVR) and the flight data recorder (FDR) from the helicopter.

Although it was initially decided that the costs and risks associated with the team going to the occurrence site outweighed the benefits, questions arose during the investigation that required an on-site examination. The ATSB conducted a risk assessment in mid-February and decided to deploy. The investigation team went to the occurrence site on 20 February and 08 March 2012, on both occasions accessing it on foot.

4.1.2.2 Witness interviews

The IIC and a technical investigator conducted the initial interviews with the pilot, the helicopter crewman, the support paramedic and the injured rock climber on 25 and 27 December, and audio-recorded them. In late January, after additional imagery and information about the ropes, equipment, and techniques used during the rescue were obtained, the team met with the ambulance service to better understand the procedures for rope usage. On 31 January, the team visited the helicopter operator to review the role the rescue equipment played in the occurrence.

On 13 January 2012, an interview was conducted with a doctor who had been a crew member in the helicopter, but who had deplaned prior to the winching operation.

Additional interviews were conducted with the pilot, the crewman, the support paramedic, employees of the National Parks and Wildlife Service, and the owner of the adjacent property throughout February and March 2012.

4.1.2.3 Weather information

Preliminary analysis indicated that the onset of darkness was the only environmental factor in the accident; consequently, collection of other routine weather information was deferred until other higher-priority investigative tasks had been completed. On 09 February 2012, investigators requested weather information from the Bureau of Meteorology (BoM). The BoM prepared a detailed meteorological report and provided it to the ATSB on 25 June 2012.

4.1.2.4 Rescue information

The rescue operation was initiated when the injured climber’s companion activated a 406 MHz personal locator beacon. The ATSB requested information from the Australian Maritime Safety Authority (AMSA) about the rescue response generated by the beacon. On 08 February 2012, AMSA gave the ATSB a report on the rescue.

4.1.2.5 Survivability information

Questions about the response time to evacuate the injured paramedic arose during the data collection phase, and were discussed with the pathologist who had conducted the autopsy. A trauma surgeon was asked to provide an expert opinion on the possibility of the injured paramedic’s survival if immediate evacuation to a trauma centre had been feasible. The trauma surgeon’s report was received on 30 August 2012.

4.1.2.6 Investigation management during data collection phase

The IIC provided a post-field-phase briefing to ATSB management on 12 January 2012, at which it was decided that the team would not attend the occurrence site due to the difficult access. (This decision was revisited in mid-February because of the need to gather information to answer questions raised during the investigation.) At this briefing, the need for human factors support for the investigation was discussed, and resources were made available to the IIC as required. It was also decided that a preliminary investigation report would be issued.

The IIC made initial contact with the next of kin in early January and, in advance of its posting to the ATSB website on 17 January 2012, advised them that the preliminary report was being released. Further briefings were provided to the next of kin throughout the data collection phase.

A request from the next of kin to listen to the CVR was considered and subsequently denied.

The team briefed the helicopter operator in early January regarding the cable angle employed during the rescue. The operator and the ambulance service received an update on the investigation on 02 February 2012 after the available CVR and FDR data were reviewed, and they were sent a draft factual report on 17 February 2012 to facilitate early safety action. FDR information was released to the helicopter operator on 09 March 2012 under Section 62 of the Transport Safety Investigations Act (TSI Act).

The ambulance service submitted a freedom of information request for the CVR information on 14 May 2012. On 07 June 2012, after discussions between the ATSB and the ambulance service, the freedom of information application was voluntarily withdrawn.

Stakeholders raised concerns in late January about possible tampering with physical evidence at the occurrence site. These concerns contributed to the reversal, following a risk assessment, of the initial decision not to visit the site.

4.1.3 Analysis phase (August 2012 to November 2012)

4.1.3.1 Use of analysis tools

In keeping with the IIC’s practice, evidence tables in the ATSB’s Safety Investigation Information Management System (SIIMS) were used extensively. The evidence tables were reviewed with the investigation team and completed prior to the initial drafting of the investigation report.

4.1.3.2 Investigation management during analysis phase

A critical review was conducted on 13 August 2012, during which the IIC briefed ATSB management on the progress of the investigation, the factual information, and the initial analysis. A decision was made to adjust the scope of the investigation to collect additional information from the helicopter operator and ambulance service regarding organizational issues that might have contributed to the occurrence.

The IIC provided two investigation updates to the next of kin in September. A further update was provided by the GM in early November.

4.1.3.3 Additional data collection during analysis phase

During the analysis phase, follow-up interviews were conducted with the surviving paramedic, the pilot in command, and the helicopter crewman, and meetings were held with the helicopter operator and the ambulance service to discuss possible organizational issues. Additional information or clarifications were sought on the helicopter illumination system, the captain’s flight times, paramedic recurrent training and certification, and aircraft maintenance and airworthiness, as well as on the results of internal investigations carried out by the ambulance service.

4.1.4 Report preparation (November 2012 to May 2013)

The team leader assigned a human factors investigator as peer reviewer for the investigation. The peer reviewer was selected on the basis of availability and workload rather than a need for specialist review. Although this was not normally done, report sections were reviewed as they were drafted so that the report could be sent to DIPs by the target date in December. The investigation team’s review and the peer review were completed on 03 December 2012.

The team leader’s review of the report was also completed on 03 December. The team leader’s practice was to review the draft report in isolation, without reviewing the evidence tables that supported the analysis.

The draft report was forwarded for review by the GM and the Commission, and was released by the Commission to DIPs and parties with an involvement (PWIs) on 15 December 2012.

Following receipt of DIP submissions, some additional data collection was carried out to clarify points raised. A follow-up interview was conducted with the pilot-in-command, and the trauma surgeon who had initially advised on the report was consulted again.

The IIC updated the draft report based on the DIP submissions, and the investigation team reviewed the final report on 15 March 2013. The team leader reviewed the final report on 17 March, and the report was forwarded to the Commission for review.

Between the completion of the final report and the report release, additional information was received from the helicopter operator and ambulance service, which was considered for inclusion in the report. Three days before the report’s public release, a further submission was received from the helicopter operator and reviewed by the Commission.

4.1.4.1 Investigation management during the report preparation phase

Investigation updates were provided to the next of kin and the helicopter crewman in early December. The IIC gave an additional briefing to the next of kin on 17 December 2012 after the report was released to the DIPs and PWIs, as well as an investigation update on 12 March 2013.

The helicopter operator and the ambulance service were briefed during the DIP process to clarify information in the draft report.

At the request of the coroner, a briefing on the draft report was given on 04 March 2013.

4.1.5 Report release (May 2013)

The report was released on 16 May 2013 and included four minor safety issues, the responses to which are available on the ATSB website.Footnote 28

No investigation closure briefing was conducted due to the IIC’s workload.

4.2 Analysis related to the Kangaroo Valley investigation

The purpose of including additional investigations in the TSB Review was to provide a means of comparison to the Norfolk Island investigation and an additional perspective on the utility and application of the ATSB’s analysis and investigation management processes. To that end, this section of the report focuses on an analysis of areas where significant similarities or differences were observed in the two investigations. As in the Norfolk Island analysis, this is structured around the main areas of inquiry included in the terms of reference for the TSB Review.

4.2.1 Application of the investigation methodology

As described in the discussion of the Norfolk Island investigation, the ATSB investigation guidelines identify four keys to an effective analysis, namely the use of well-defined concepts; a structured approach or process; a team-based approach; and knowledge of the domain being investigated.Footnote 29

The Kangaroo Valley analysis required the evaluation of a number of possible scenarios for the sequence of events. It then involved the analysis of the role of a number of operational conditions present at the time of the occurrence, and the organizational issues that led to those conditions.

By way of example, the analysis considered a number of possible scenarios with respect to the duty paramedic’s plan for the use of a stabilizing rope, and ultimately concluded that there was no evidence to indicate that the planned stabilizing rope system had been established when the paramedic and patient were pulled from the cliff. The report included analysis to support the conclusion that the stabilizing rope would not have prevented the fall. The report also analyzed the factors that contributed to paramedics using adapted procedures that had not been documented or in which they had not been trained. This became one of the four minor safety issues elaborated on in the report.

The analysis of these and other issues in the investigation relied heavily on the use of evidence tables before the report was drafted. This allowed the available evidence for the different scenarios for rope use to be laid out and reviewed by the investigation team. It also allowed further data collection to be planned during the analysis phase to validate factors that were hypothesized to have played a role in the occurrence, further strengthening the analysis. Completion of the evidence tables and validation of the arguments with the investigation team greatly expedited the report writing process.

Overall, the Kangaroo Valley investigation is a good demonstration of the potential of the analysis tools to structure thinking, test hypotheses, provide a basis for team discussion, assist in identifying the need for additional data, and expedite report writing.

4.3 Management and governance of the investigation

4.3.1 Timeliness

ATSB guidelines set a target of nine months to complete an investigation of this scope. At just under 17 months from occurrence to report release, this was the shortest of the three investigations examined by the TSB Review.

The investigation team was very conscious of timeliness throughout the investigation and had aimed to provide a draft report to DIPs before the one-year anniversary of the occurrence. The team felt this was particularly important given that the occurrence happened at Christmastime and they understood that the holidays combined with the anniversary of the occurrence would be a particularly difficult period for the next of kin.

Steps were taken to expedite the report preparation process to meet this goal. For example, the peer review was conducted in parallel with report writing. Although this was unusual, it was a mutually agreed-upon approach that enabled the investigation to progress more quickly, and helped the team meet the time frame they had set.

The target timeline outlined in SIQS for a Level 3 investigation was exceeded despite significant effort by the team to expedite the investigation, indicating that the targets were unrealistic, that the investigation was incorrectly classified, or that other work was affecting the published investigation schedules. This increased the risk that stakeholders’ expectations with respect to timeliness would not be met.

4.3.2 Stakeholders

The support of stakeholders is critical to encouraging early safety action to address issues identified during an investigation. In this investigation, effort to engage stakeholders was evident, including

  • regular briefings for the next of kin and stakeholders on investigation progress;
  • action taken throughout investigation to respond to concerns raised by stakeholders (this led to additional investigation activities including site visits by the investigation team and a consultation with a trauma surgeon); and
  • briefings and discussions with stakeholders throughout the DIP process.

The benefits of active engagement with stakeholders are clearly demonstrated through this investigation: stakeholder expectations for timelines were well managed; additional avenues of data collection were identified and pursued; and timely safety action was taken on all four identified safety issues before the report was released.

4.3.3 Critical investigation review process

Critical investigation reviews are intended to update ATSB management on the progress of an investigation, answer questions with respect to scope and direction, and address any significant project risks. In the Kangaroo Valley investigation, critical reviews were used to address possible issues concerning the composition of the investigation team and the risks associated with the initial decision not to visit the site. In August 2012, another critical review resulted in the decision to collect additional data to explore organizational issues in the helicopter company and the ambulance service.

This stands in contrast to the Norfolk Island investigation in which critical reviews were ineffective in identifying data-sufficiency issues, and may have been the source of misunderstandings about which data to collect.

The reasons for the differences in the effectiveness of this process are complex. What is evident is that regular reviews during the Kangaroo Valley investigation provided an opportunity for discussion, contributed to revisions of investigation scope, and led to data being collected throughout the investigation to support new avenues of investigation.

4.3.4 Team leader review

The ATSB’s safety investigation guidelines describe the manager (team leader) as having an ongoing responsibility for reviewing the focus and progress of an investigation. They also state that managers will specifically assess the output of any peer review process and assess the standard of the investigation report against ATSB quality objectives.Footnote 30

In the case of the Kangaroo Valley investigation, the time taken for the formal review of the draft reports was brief. The team leader’s review of the initial draft report was completed the same day as the peer review, and the review following the DIP process took two days.

These time frames may be an indication that the team leader had been tracking report progress and had little review left to do, or they may be indicative of a well-written report that required little review. However, because it was the team leader’s practice to review the reports in isolation without reviewing the underlying analysis, it is not clear that the team leader’s review would have identified any quality issues in the draft reports had they been present.

4.4 Findings from the TSB review of the Kangaroo Valley investigation

  1. The Kangaroo Valley investigation is a good demonstration of the potential of the analysis tools to structure thinking, test hypotheses, provide a basis for team discussion, assist in identifying the need for additional data, and expedite report writing.
  2. The target timeline set in SIQS for a Level-3 investigation was exceeded, despite significant effort by the team to expedite the investigation, indicating that either these targets are unrealistic, that the investigation was incorrectly classified, or that other work was affecting the published investigation schedules. This increased the risk that stakeholders’ expectations with respect to timeliness would not be met.
  3. The benefits of active engagement with stakeholders are clearly demonstrated through this investigation: stakeholder expectations for timelines were well managed; additional avenues of data collection were identified and pursued; and timely safety action was taken on all four identified safety issues before the report was released.
  4. Regular critical reviews during the Kangaroo Valley investigation provided an opportunity for discussion, contributed to revisions of investigation scope, and led to data being collected throughout the investigation to support new avenues of investigation.
  5. Because it was the team leader’s practice to review the reports in isolation without reviewing the underlying analysis, it is not clear that the team leader’s review would have identified quality issues in the draft reports had they been present.

5.0 Review of the Canley Vale investigation

5.1 Factual information related to the Canley Vale investigation

At approximately 0806 Australian Eastern Standard Time on 15 June 2010, a Piper PA-31P-350 Mojave aircraft, registered VH-PGW and operated by Skymaster Air Services, with a pilot and a flight nurse on board, collided with terrain in a suburban area about 6 km north-west of Bankstown Airport, New South Wales. At the time of the accident, the pilot was attempting to return to Bankstown following a reported in-flight engine shutdown. Both occupants were fatally injured, and the aircraft was destroyed by the impact forces and an intense post-impact fire.

The following issues were examined in the course of the investigation:

  • Pilot training and checking
  • Aircraft performance
  • Pilot performance
  • Flight management
  • Regulatory oversight.

5.1.1 Notification and assessment

The Rescue Coordination Centre notified the ATSB of the occurrence just before 0830. The decision was taken to initiate a Level 3 investigation, and an IIC and four additional investigators were assigned to the occurrence. The IIC and one other investigator had an operations background; the other three were engineering investigators. At the time of the occurrence, a Level 3 investigation was defined as one that involves in-the-field activity, up to several ATSB and possibly external resources, and/or a scale and complexity that usually requires up to 12 months to complete.Footnote 31

Following a team briefing and preparations, the team departed for the accident site just before noon and was on site by mid-afternoon.

5.1.2 Data collection (June 2010 to February 2012)

5.1.2.1 Site and aircraft examination

The data collection phase was initially focused on activities at the occurrence site, including documenting the wreckage and collecting parts for further analysis. The ATSB assumed control of the accident site from the NSW Police in the afternoon of 15 June and relinquished control of the accident site to the aircraft insurer mid-morning on 17 June.

The team remained in the Sydney area after it had completed its on-site activities to collect documentation from the operator and begin conducting interviews. The three engineering team members returned to Canberra on the evening of 18 June. The operations investigators returned the following afternoon after conducting interviews with pilots from the occurrence operator.

5.1.2.2 Witness interviews

From the date of the occurrence until the completion of interviewing activities on 07 November 2011, investigators interviewed witnesses to the accident, air traffic controllers, company managers, other company pilots, next of kin, and inspectors from the Australian Civil Aviation Safety Authority (CASA). Audio recordings of the interviews were filed in the Safety Investigation Information Management System (SIIMS).

5.1.2.3 Weather information

On 24 June 2010, investigators asked the Bureau of Meteorology (BoM) for weather information, and received detailed reports on 02 and 29 July 2010.

5.1.2.4 Other information

ATSB investigators conducted examinations at independent maintenance facilities of several aircraft components, including engines (20 to 23 July 2010), propellers (10 to 11 August 2010), turbochargers (09 to 10 September 2010), fuel system components (05 to 07 April 2011) and engine ignition and lubrication systems (19 April 2011). The engine cowl flap actuators were examined at the ATSB technical facilities on 15 November 2011.

Samples from the fuel truck that provided fuel to the aircraft on the day of the occurrence were analyzed by an independent laboratory on 26 August 2010.

A flight was conducted in an exemplar PA-31P-350 aircraft to collect baseline recordings so that spectral analysis of the air traffic control (ATC) communication recordings from the occurrence flight could be performed. This flight was originally scheduled for March 2011, but was postponed because the exemplar aircraft was not available; it was eventually made on 01 September 2011. The spectral analysis, completed in February 2012, made it possible to determine engine and propeller settings from recorded ATC audio for the accident flight.
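The report does not describe the spectral-analysis technique itself. Purely as an illustration of the general idea, and not as the ATSB’s actual method, the sketch below locates a propeller tone in a recording with a Fourier transform and converts it to shaft RPM using a known blade count. The file name, frequency band, and blade count are hypothetical assumptions, and the comparison against the exemplar-aircraft baseline recordings is omitted.

    # Illustration only: locate a propeller tone in an audio recording and
    # convert it to shaft RPM. Not the ATSB's tooling; the file name,
    # frequency band, and blade count are hypothetical assumptions.
    import numpy as np
    from scipy.io import wavfile

    def dominant_tone_hz(path, f_lo=50.0, f_hi=200.0):
        """Return the strongest spectral peak between f_lo and f_hi (Hz)."""
        rate, audio = wavfile.read(path)
        audio = audio.astype(float)
        if audio.ndim > 1:                 # mix a stereo recording down to mono
            audio = audio.mean(axis=1)
        spectrum = np.abs(np.fft.rfft(audio))
        freqs = np.fft.rfftfreq(len(audio), d=1.0 / rate)
        band = (freqs >= f_lo) & (freqs <= f_hi)
        return float(freqs[band][np.argmax(spectrum[band])])

    BLADES = 3  # hypothetical; the real count comes from the aircraft records

    tone = dominant_tone_hz("occurrence_transmission.wav")
    rpm = tone / BLADES * 60.0  # blade-passing frequency (Hz) -> shaft RPM
    print(f"Propeller tone {tone:.1f} Hz implies about {rpm:.0f} RPM")

In practice, the spectrum of the occurrence audio would be compared against the baseline recordings from the exemplar flight, in which the engine and propeller settings were known.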

At a meeting on 29 July 2010, the ATSB was briefed on the findings of CASA’s special audit of the operator in the Canley Vale occurrence. It was informed, in particular, that CASA had identified issues with pilot training and aircraft maintenance oversight. It also learned that the special audit had found that the accident pilot had not been re-qualified as CASA had stipulated in a previous audit, and that CASA had not detected the omission.

5.1.2.5 Investigation management during data collection phase

The urban location of the accident site created pressure to complete the on-site phase of the investigation quickly and attracted a significant media presence. An initial briefing with the police and fire services, covering occupational health and safety and site-handover issues, was conducted upon arrival at the occurrence site, and the IIC gave a media briefing shortly thereafter.

The first post-field phase team meeting was held on 23 June 2010. Discussions covered a review of on-site activities, investigation assignments, investigation issues and risks, and the investigation schedule. Frequent and well-documented meetings were held between June 2010 and November 2011 (end of analysis phase).

The IIC briefed ATSB management on 02 July 2010 after the on-site phase had ended, and covered the occurrence response, stakeholder management, potential safety issues, proposed lines of inquiry, potential investigation risks, and lessons learned to date.

A potential safety issue was identified early in the investigation: the adequacy of guidance material relating to the assessment of student competency in managing engine problems or failures during cruise in multi-engine aircraft. An initial briefing on this issue was given at a meeting between ATSB investigators and CASA staff on 29 July 2010, and the IIC and CASA staff had follow-up discussions in September and October 2010 and in July 2011.

There was a significant reduction in activity on this investigation between October 2010 and August 2011 due to investigation team members being involved in other, higher-priority investigations.

A critical investigation review was conducted on 11 February 2011. The IIC gave a briefing on the status of the investigation, stakeholder management, investigation issues and risks, investigation scope, and resources. It was at this meeting that senior management approved the resources for the flight in an exemplar aircraft to collect baseline ATC recordings. As well, an additional investigator with a specialization in human factors was added to the investigation team, and a decision was made to expand the scope of the review of CASA surveillance files to include the period in which the operator was purchased by a new owner.

A second critical investigation review was conducted on 07 July 2011. It was decided that a review of operator maintenance would not be included in the scope of the investigation, given time and resource constraints and the fact that the occurrence aircraft had been found not to have maintenance issues.

A communication plan was developed by the IIC early in the investigation and was followed throughout.

5.1.3 Investigation analysis phase (Post-field phase to March 2012)

5.1.3.1 Use of analysis tools

The use of analysis tools, including sequence-of-events analysis, safety factor analysis, and the identification of safety issues, began during the first post-field team meeting on 23 June 2010. Their use continued throughout the data collection and analysis phases and included

  • evidence tables to test the existence and influence of safety factors and identify additional data collection requirements;
  • an acci-map to show the relationships between safety factors and the occurrence; and
  • a mind-map diagram to show issues that had been considered in the course of the investigation and been ruled out.

A team approach to the completion and review of analysis tools was adopted throughout.

5.1.3.2 Investigation management and additional data collection during analysis phase

The investigation was effectively managed throughout. Data collection and analysis, discussed above, ran concurrently during the post-field phase.

5.1.4 Report preparation (March–December 2012)

5.1.4.1 Preliminary report (July 2010)

A preliminary report was released to the public on 15 July 2010. Preparation had begun in late June, and the draft was submitted for management review on 05 July.

5.1.4.2 Final report (March–December 2012)

Team review of the draft final report began on 13 March 2012 and was completed one week later. The revised draft was sent for peer review on 26 March 2012.

Two peer reviewers were assigned to the report: one operations investigator, and one technical investigator with an avionics background. The peer review was very thorough, examining the report and all of the occurrence documentation and analysis tools. It was completed on 08 May 2012.

Revisions to the report based on the peer review were completed on 05 June, and the report was submitted for team leader and GM reviews. These were completed on 09 July and 03 August respectively.

On 06 August 2012, the draft report was sent to directly involved parties (DIPs) and parties with involvement (PWIs) on the authority of the GM. Consistent with the ATSB’s procedures for Level-3 investigations, the Commission did not review the report prior to the DIP process.

Responses from DIPs and PWIs were received by early September. The report was revised and submitted for GM and Commission review on 26 November. Amendments were made after these reviews.

5.1.4.3 Investigation management in the report preparation phase (March–December 2012)

On 05 September 2012, the ATSB, prompted by lessons learned from the Norfolk Island investigation, asked for a copy of CASA’s special audit of the operator in the Canley Vale occurrence, which gave rise to factual information being added to the report.

5.1.5 Report release (December 2012)

The final report was released to the public on 20 December 2012.

On 22 February 2013, the IIC delivered an investigation closure management briefing to the investigation team and ATSB managers. The briefing reviewed the investigation findings, the status of evidentiary material, risks during the investigation, ongoing risks and issues, and lessons learned.

There was also a discussion of whether CASA special audits should be obtained as a matter of course in ATSB investigations. The discussion arose from the fact that, even though ATSB investigators had known about the CASA special audit as early as July 2010, the organization did not ask for a copy of it until September 2012, and then only because of the controversy generated by the Norfolk Island report.

5.2 Analysis related to the Canley Vale investigation

Other investigations were included in the TSB Review so that they could be contrasted with the Norfolk Island investigation and provide additional perspective on the utility and application of the ATSB’s analysis and investigation management processes. To that end, this section of the report focuses on analyzing areas where significant similarities or differences were observed in the two investigations. As in the Norfolk Island analysis, this section is structured around the main areas of inquiry included in the terms of reference for the TSB Review.

5.2.1 Adequacy of data collection

An effective data collection plan is one that is periodically revisited and revised over the course of an investigation as issues are identified and decisions are made with respect to the investigation’s scope. On a number of occasions in the Canley Vale investigation, requirements for additional data were identified and plans were made to obtain them. Two notable examples were the decision to conduct a flight in an exemplar aircraft to collect audio recordings and aircraft performance data, and the decision taken late in the investigation to obtain the report of the special audit of the operator that CASA conducted post-occurrence.

The question of regulatory oversight was repeatedly revisited throughout the investigation: CASA inspectors were interviewed as part of the investigation; a briefing on the contents of the CASA special audit was obtained relatively early in the investigation; the need to obtain additional CASA surveillance files was raised in the February 2011 critical investigation review; and a copy of the CASA special audit was eventually requested from the regulator. As a result, the ATSB had the information it needed as the investigation progressed to make informed scoping decisions with respect to an investigation of regulatory oversight.

The collection of additional data late in this investigation demonstrates the value of revisiting the data collection plan as needs and expectations change.

5.2.2 Application of the investigation methodology

As described in the discussion of the Norfolk Island investigation, the ATSB investigation guidelines identify four keys to an effective analysis, namely the use of well-defined concepts; a structured approach or process; a team-based approach; and knowledge of the domain being investigated.Footnote 32

It can often be difficult to determine the sequence of events in fatal general aviation accidents. If it is impossible to interview crew members and there are no on‑board recordings, multiple scenarios need to be analyzed to arrive at the most plausible one.

Analysis tools used in team reviews of the Canley Vale investigation showed that more data were needed to determine what happened during the occurrence. This resulted in the flight in an exemplar aircraft to make the baseline audio recordings and collect aircraft performance data. These data were compared with the air traffic control audio recordings, radar data from the occurrence flight, and other information gathered during the course of the investigation to

  • identify the nature of the engine problem;
  • ascertain the actions taken by the pilot to manage the engine problem;
  • analyze the performance of the aircraft with one engine inoperative to determine its effect on aircraft handling; and,
  • determine whether issues identified with the pilot’s initial PA-31 endorsement played a role in the occurrence.

The ATSB analysis methodology was used effectively throughout the investigation. During regular team meetings before the report was drafted, a number of processes—including safety factor maps and evidence tables—were applied to test hypotheses, check data requirements, analyze the data, and review the conclusions.

The TSB Review of the Canley Vale investigation revealed the value of such tools, the following being a case in point.

One of the issues the investigators examined concerned the quality of the pilot’s initial training on the PA-31. The pilot had been identified as one of a group of pilots requiring recertification because of flight-testing practices that were unacceptable to CASA. CASA had stipulated that the occurrence pilot be re-examined for proficiency on the PA-31, but the recertification was never done, and CASA did not detect the omission.

It is counterintuitive to think that a training and pilot-certification issue would not be considered contributory to an occurrence in which aircraft handling had been a factor. The ATSB investigation determined that although the required re-test did not take place, the pilot had subsequently undergone check rides for both instrument rating renewals and endorsement on another type of aircraft. Therefore, the ATSB concluded that the issue did not contribute to this occurrence because the pilot had demonstrated the required standard of proficiency on a number of occasions prior to the occurrence. In terms of the ATSB’s investigation methodology, the safety factor passed the test for existence but did not pass the test for influence.

Throughout the analysis of the training and pilot-certification issues, the use of evidence tables provided an effective means for team members and reviewers to clearly see the logic underlying the report.
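The tests for existence and influence are defined in detail in the ATSB manuals. Offered only as a loose illustration of the two-stage logic described above, and not as the ATSB’s actual procedure or data model, the following minimal sketch separates the question of whether a safety factor existed from the question of whether it influenced the occurrence; all names and evidence strings are hypothetical.

    # Loose illustration of the two-stage logic described above: a safety
    # factor must first pass a test for existence; only then is its influence
    # on the occurrence assessed. Hypothetical names, not the ATSB's schema.
    from dataclasses import dataclass, field

    @dataclass
    class SafetyFactor:
        description: str
        existence_evidence: list = field(default_factory=list)
        influence_evidence: list = field(default_factory=list)

    def contributory(factor):
        # In a real investigation both tests are evidential judgements made
        # by the team, not simple checks for the presence of list entries.
        return bool(factor.existence_evidence) and bool(factor.influence_evidence)

    training = SafetyFactor(
        description="Required PA-31 re-examination was never conducted",
        existence_evidence=["CASA special audit identified the omission"],
        influence_evidence=[],  # later check rides demonstrated proficiency
    )

    print("contributory" if contributory(training)
          else "existed, but did not influence this occurrence")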

5.2.3 Management and governance of the investigation

5.2.3.1 Regulatory oversight

The Canley Vale investigation report provides a history of the regulatory oversight of the operator. Given that CASA revoked the operator’s certificate as a result of its special audit, it is reasonable to ask whether the conditions that led to the CASA action predated the accident and why CASA had not identified these conditions prior to the occurrence. The ATSB was aware of the contents of the special audit report by February 2011; therefore, information to indicate potential deficiencies in regulatory oversight was available.

The Canley Vale investigation report refers to three issues of regulatory non-compliance that led to the cancellation of the operator’s Air Operator’s Certificate. However, the report concludes that these issues did not contribute to the occurrence. The TSB Review found that the analysis leading to this conclusion was sound, but considered that a more thorough discussion would have clarified the underlying rationale for the decision not to explore the perceived deficiencies in more depth.

5.2.3.2 Stakeholder management

The support of stakeholders is critical to achieving early safety action to address issues identified in an investigation. Early in the Canley Vale investigation, the team identified stakeholders with whom contact should be maintained and developed a communications plan. The team employed a technique not seen in other investigations, which involved identifying the level of interest and influence of individual stakeholders and developing specific plans for communicating with them. One example of this was the safety issue involving the adequacy of the guidance material in the multi-engine training standards, which was initially communicated to CASA at a meeting in July 2010, and which the IIC followed up with CASA periodically afterward.

In the Canley Vale investigation, regular contact with stakeholders resulted in timely safety action.

5.2.3.3 Critical investigation reviews

Critical investigation reviews are conducted to provide an opportunity to check on the course and progress of an investigation. Similarly to the Kangaroo Valley investigation but in contrast to the Norfolk Island investigation, the Canley Vale critical investigation reviews added value in that the review of the scope of the investigation resulted in well-documented decisions and periodic revision of data collection plans.

5.2.3.4 Peer review process

The scope of the ATSB peer review process varied considerably across the three investigations reviewed by the TSB team. In the Canley Vale investigation, the peer review process was more robust than those of the other investigations in that two reviewers did an in‑depth examination of the data collection and analysis in the electronic occurrence workspace. For example, all of the evidence tables were reviewed, and several administrative issues were identified.

5.2.3.5 Closure briefing

Unlike the other two investigations the TSB reviewed, the Canley Vale investigation included a closure briefing, which provided an opportunity for a discussion of the lessons learned. It was during the closure briefing, for example, that the question of when the ATSB should request copies of CASA special audit reports arose.

5.3 Findings from the TSB review of the Canley Vale investigation

  1. On a number of occasions in the Canley Vale investigation, requirements for additional data were identified and plans were made to obtain them.
  2. The ATSB had the information it needed as the investigation progressed to make informed decisions with respect to investigating regulatory oversight.
  3. The collection of additional data late in the investigation demonstrates the value of revisiting the data collection plan as needs and expectations change.
  4. The use of evidence tables provided an effective means for team members and reviewers to clearly see the logic underlying the report.
  5. The analysis section of the Canley Vale investigation report states that issues of regulatory non-compliance did not contribute to the occurrence. The analysis leading to this conclusion was sound, but a more thorough discussion would have clarified the underlying rationale for the decision not to explore the perceived deficiencies in more depth.
  6. In the Canley Vale investigation, regular contact with stakeholders resulted in timely safety action.
  7. Similarly to the Kangaroo Valley investigation but in contrast to the Norfolk Island investigation, the Canley Vale critical investigation reviews added value in that the review of the scope of the investigation resulted in well-documented decisions and periodic revision of data collection plans.
  8. Unlike the other two investigations the TSB reviewed, the Canley Vale investigation included a closure briefing, which provided an opportunity to discuss lessons learned.

6.0 Summary and recommendations to the ATSB

This section synthesizes the findings from the comparison of TSB and ATSB methodologies and the three investigations reviewed, highlighting lessons learned and outlining recommendations to the ATSB.

The section is organized around the five areas of inquiry in the terms of reference for the TSB Review: review of investigation methodology; application of the ATSB investigation methodology; management and governance of the investigations; investigation report processes; and external communications.

6.1 Review of investigation methodology

6.1.1 Value of ATSB investigation methodology and analysis tools

A robust investigation methodology used by all investigators has many benefits, including a common understanding of the etiology of accidents; a systematic and iterative method for collecting and analyzing occurrence information; and a framework for collaboration, including sharing and challenging analyses and conclusions.

The TSB Review clearly demonstrates the significant similarities between the ATSB and TSB methodologies. This is not surprising given that the methodologies are used for a common purpose and based on the same underlying models. These models stress the need to work from the occurrence events and progress systematically to other areas of the system to identify and address the underlying factors.

Recommendation #1: Given that the ATSB investigation methodology and analysis tools represent best practice and have been shown to produce very good results, the ATSB should continue efforts to ensure the consistent application and use of its methodology and tools.

6.1.2 Quality controls for report review processes

The value of any methodology is realized through consistent and skilled application. The TSB Review of the three investigations found significant differences in the manner in which the investigation methodology was applied, which produced variability in investigation quality and defensibility. These differences are the subject of the next section covering the application of the ATSB methodology. This section focuses on weaknesses in the ATSB report review processes, which allowed these differences to escape attention.

The TSB Review found that weak application of the ATSB methodology was not addressed in the Norfolk Island investigation due to systemic deficiencies in the review processes in place. Report reviews set out in the ATSB's Safety Investigation Quality System include peer review, review by directly involved parties (DIPs) and Commission review. While all of these reviews identified issues with the quality and defensibility of the Norfolk Island report, all three review processes were found to lack controls to ensure that the issues that had been identified were effectively dealt with.

All three of these review processes rely exclusively on the IIC to effectively address the issues reviewers raise. The system requires that IICs consider the comments made by peer reviewers and DIPs and document their responses using forms in SIIMS. However, there is no mechanism to ensure that the IIC has addressed each comment or to assess the adequacy of each response.

While the consequence of these gaps was observed only in the Norfolk Island investigation, the lack of controls within these processes could prevent data insufficiency and weak analysis from being addressed in other investigations.

Recommendation #2: The ATSB should consider adding mechanisms to its review process to ensure there is a response to each comment made by a reviewer, and that there is a second-level review to verify that the response addresses the comment adequately.

One significant difference in approach between the TSB and the ATSB relates to the feedback given to external report reviewers. In Canada, staff prepare a written reply describing the Board’s response to each comment and what changes, if any, have been made to the report in consequence. Once approved by the Board, the replies are provided to the reviewer.

In Australia, IICs record their responses to DIP comments in the tables in SIIMS, but no written response is provided to DIPs. This lack of communication may make it difficult for DIPs to understand the action taken in response to their submissions, which risks diminishing their acceptance of the report. In addition, because there is no formal process to ensure that the Commission reviews staff responses to DIPs' comments, the Commission may not detect those that have not been properly addressed, resulting in weaknesses in the published report.

Recommendation #3: The ATSB should augment its DIP process to ensure the Commission is satisfied that each comment has been adequately addressed, and that a response describing actions taken by the ATSB is provided to the person who submitted it.

6.1.3 Use of risk labels

The ATSB SIQS requires that a risk assessment be conducted prior to initiating any safety communication or safety advisory notice. It has also been the ATSB's practice to publish the level of risk resulting from these risk assessments for any safety factor identified in an investigation report.

The use of these risk labels was problematic in the Norfolk Island investigation in that the safety issue communicated to the Australian Civil Aviation Safety Authority (CASA) was initially described as critical, but then later described in the report as minor. The conduct of a risk assessment is a valuable exercise in developing a compelling safety argument. However, in the Norfolk Island investigation, the use of level-of-risk labels when communicating safety issues did not contribute to advancing safety, and focused discussion on the label rather than on the identified issue and the potential means of its mitigation.

Recommendation #4: The ATSB should review its risk assessment methodology and the use of risk labels to ensure that risks are appropriately described, and that the use of the labels is not diverting attention away from mitigating the unsafe conditions identified in the investigation.

6.1.4 Target timelines

The timelines set for the completion of ATSB investigations were exceeded in all three of the investigations in the TSB Review. In the Norfolk Island and Canley Vale investigations, the TSB Review found systemic factors that contributed to delays in producing the reports, including the completion of high-priority investigations and availability of staff and equipment. The Kangaroo Valley investigation also exceeded the target timelines set in ATSB SIQS despite the team's significant efforts to expedite the investigation and the absence of systemic factors leading to delays. This indicates that the targets are unrealistic, that the investigation was incorrectly classified, or that other work was affecting the published investigation schedules. This increases the risk that stakeholders' expectations with respect to timeliness will not be met.

Recommendation #5: The ATSB should review its investigation schedules for the completion of various levels of investigation to ensure that realistic timelines are communicated to stakeholders.

6.2 Application of investigation methodology

6.2.1 Approach to use of analysis tools

The ATSB SIQS emphasizes the keys to an effective analysis, which include a systematic approach to ensure that the safety issues are identified on the strength of the data, a team approach to bring different perspectives and challenge assumptions, and an iterative approach to ensure that new avenues of inquiry are properly explored.

In the Canley Vale and Kangaroo Valley investigations, analysis tools including safety factor maps and evidence tables were used to illustrate the relationships between occurrence events and underlying factors, and to test various hypotheses in the course of the analysis. Both investigations successfully determined a most-probable sequence of events from a number of possible scenarios and clearly demonstrated the link (or lack thereof) between the sequence of events and the underlying factors using a bottom-up approach. In both of these investigations, the IICs were diligent in using the analysis tools in a team setting to challenge their thinking. In addition, data collection plans in both investigations were regularly updated as a result of using these tools.

In the Norfolk Island investigation, the application of similar analysis tools by an individual working in isolation using a top-down approach resulted in the tools being used to substantiate the identified safety issues, rather than to challenge them. Throughout the investigation the IIC remained convinced of the validity of the high-level issues. Identified deficiencies in data collection went largely unaddressed by the IIC. The weaknesses in the quality control processes described in the previous section allowed this situation to continue.

The TSB analysis of the three investigations highlighted the value of the ATSB analysis tools in providing a framework for the systematic, team-centred, iterative approach sought by the ATSB. On the other hand, the TSB Review also found that applying the analysis tools in isolation increases the risk that conclusions will not be defensible.

Recommendation #6: The ATSB should take steps to ensure that a systematic, iterative, team approach to analysis is used in all investigations.

6.2.2 Analysis of fatigue

Data collected during the Norfolk Island investigation were insufficient to conduct a proper fatigue analysis. The ATSB analysis methodology requires a test for existence and test for influence of any safety factor, and is consistent with best-practice approaches to analyzing sleep-related fatigue. However, there are gaps in the guidance provided to ATSB investigators on the collection of sufficient data and on the criteria for determining whether a state of fatigue exists.

The ATSB would benefit from adoption of a specific tool to guide data collection and analysis in the area of fatigue.

Recommendation #7: The ATSB should provide investigators with a specific tool to assist with the collection and analysis of data in the area of sleep-related fatigue.

6.3 Management and governance of the investigation

6.3.1 Management oversight of investigations

In the Norfolk Island investigation, problems arose—and persisted—due to a combination of factors, including weaknesses in management oversight of the investigation process, a transition in management positions, competing demands on the team leader from other high-profile investigations, and a backlog of investigation reports that were diverting the GM's attention.

These organizational issues and competing priorities did not have a negative impact on the Kangaroo Valley and Canley Vale investigations, but the short amount of time the team leader spent on the review of the Kangaroo Valley report may indicate that a similarly hands-off approach was taken in other ATSB investigations.

In the period since the Norfolk Island report, new team leaders have been appointed, and they are taking steps to implement a more hands-on approach to the management of their teams' investigations, including early reviews of the data collected and more active involvement in all phases of the analysis. While these steps should be effective in addressing the weaknesses observed in the Norfolk Island investigation, they are highly dependent on the initiative of the new team leaders themselves, and need to be fully integrated into SIQS to manage the risks associated with a change in personnel in these key positions.

Recommendation #8: The ATSB should review the quality assurance measures adopted by the new team leaders and incorporate them in SIQS to ensure that their continued use is not dependent on the initiative of specific individuals.

6.3.2 Commission oversight of investigations

The TSB Review of the Norfolk Island investigation found that the ATSB Commissioners may not have sufficient control over the content and quality of the report that they will ultimately be responsible for approving for release.

The ATSB policy that allows certain reports to be released for DIP review without Commission review means that in many cases, the first Commission review of draft reports takes place after the DIP process has been completed, at a point where issues that have been identified are difficult or impossible to address.

In addition, there is no formal mechanism for the Commission to communicate with staff or to ensure that issues raised by the commissioners are systematically addressed. In the case of the Norfolk Island investigation, after the second DIP process, the Commission sent comments on the draft report by e-mail, asking why more use had not been made of the CASA special audit in the analysis section of the report. It is not clear that staff considered and addressed these comments before the report was released, and there was no mechanism to ensure they would be.

Recommendation #9: The ATSB should modify the Commission report review process so that the Commission sees the report at a point in the investigation when deficiencies can be addressed, and the Commission's feedback is clearly communicated to staff and systematically addressed.

The policy under which the Commission chooses which draft reports to review before sending them to the DIPs and the means by which the Commission communicates with staff may both be a necessary result of the structure and function of the Commission. The ATSB Commission consists of one full-time member, who also serves as the chief executive, and two part-time commissioners who are geographically dispersed.

In contrast, the TSB has a dedicated chairperson, two full-time board members and two part-time board members who typically work three days per week. This allows the Board to review each report in committee twice (pre-DIP and post-DIP) and to provide specific direction to staff. All reports are approved by the Board before they are released publicly, and the Board also approves all responses to the DIPs' representations. The differences between the accountability of the Board and the accountability of the directors of investigations and their staff are clearly defined. While this review and approval process can be time consuming, its rigour demonstrates that the Board has exercised its responsibilities as the ultimate authority for the approval of TSB reports.

Recommendation #10: The ATSB should undertake a review of the structure, role, and responsibilities of its Commission with a view to ensuring clearer accountability for timely and effective oversight of the ATSB's investigations and reports.

6.3.3 Investigation scope and documentation of critical reviews

The Norfolk Island investigation did not adequately address a number of important issues, and the main reason for this, the TSB Review found, was that there were insufficient data to support effective analysis. Gaps in data collection did not result from deliberate decisions to limit the scope of the investigation, but from misunderstandings and ineffective application of the investigation methodology.

In contrast, decisions related to scope in the Canley Vale and Kangaroo Valley investigations were explicit and well documented during the critical review process. In some cases, these decisions involved pursuing specific lines of inquiry, for instance, the organizational issues in the Kangaroo Valley investigation. In other cases, the scope of the investigation was limited, as evidenced in the Canley Vale occurrence by the decision not to pursue an investigation of maintenance oversight because the occurrence aircraft had not been found to have any maintenance issues.

Although decisions with respect to investigation scope can be contentious, clearly documenting them demonstrates the rationale for them, resulting in decisions that are more defensible and in greater confidence in the investigation process.

The TSB Review found that the effectiveness of the critical review process varied among the investigations, largely as a result of individual differences in how the reviews were conducted and documented.

Recommendation #11: The ATSB should adjust the critical investigation review procedures to ensure that the process for making and documenting decisions about investigation scope and direction is clearly communicated and consistently applied.

6.3.4 Closure briefing

Of the three investigations reviewed, only the Canley Vale investigation included a closure briefing. The notes from this briefing indicate that it provided a good opportunity to reflect on lessons learned and to encourage continuous improvement of ATSB processes.

Recommendation #12: The ATSB should take steps to ensure closure briefings are conducted for all investigations.

6.3.5 Roles of ATSB and CASA and investigation of regulatory oversight

In the Norfolk Island investigation there was a misunderstanding about the responsibilities of CASA and the ATSB as they conducted their respective investigations. This misunderstanding was never resolved, and had a significant effect on the ATSB's data collection, resulting in a missed opportunity to analyze the company's and the regulator's oversight activities.

Recommendation #13: The ATSB should provide clear guidance to all investigators that emphasizes both the independence of ATSB investigations, regardless of any regulatory investigations or audits being conducted at the same time, and the importance of collecting data related to regulatory oversight as a matter of course.

6.4 External communication

In the Norfolk Island investigation, although senior managers were aware of the potential for the report to generate some controversy when it was released, communications staff were not consulted and no communications plan was developed to address it. Once the investigation became the subject of an external inquiry, the ATSB could no longer comment publicly on the report, which hampered the Bureau's ability to defend its reputation.

Recommendation #14: The ATSB should implement a process to ensure that communications staff identify any issues or controversy that might arise when a report is released, and develop a suitable communications plan to address them.

7.0 Best practices to bring back to the TSB

It was understood from the outset that the TSB Review would be a learning opportunity for the Canadian organization. It is unprecedented for a major safety bureau to invite another to review its governance framework and investigation tools, and to be afforded an in-depth look at how it applies them.

The purpose of this section is to briefly describe the best practices that the TSB Review team observed in the course of its work.

A full analysis of how such practices would be implemented at the TSB is outside the scope of this report, and a discussion of the potential benefits of these practices has been deliberately avoided here. In some cases, the TSB employs similar methods, but might look to the ATSB examples to modify and improve its approach to certain phases of its investigations. This section is intended as a starting point, and the reader is directed to the cited ATSB SIQS documentation for a more detailed view of these practices.

Ultimately, the value of the practices described here depends on the effectiveness of their implementation. The TSB Review provides some important insights into the organizational support required to make these practices work, and the discussion below should be considered in the context of the entire report.

7.1 Internal investigation reviews

7.1.1 Critical investigation review

The ATSB Safety Investigation Policy and Procedures Manual states that "All investigations classified at Level 4 or higher, must include a critical investigation review no later than the commencement of the preparation of the safety analysis section of the Draft Final Report."Footnote 33

The critical investigation review is held in addition to a post-field phase management briefing and an investigation closure briefing. The intent is to conduct the review at the point in the investigation where the focus changes from data collection to analysis and report writing.

The review is to include a discussion of the investigation findings, the management of the investigation, and how any investigation risks are being addressed. Participation is commensurate with the scale of the investigation. In addition to investigation team members and peer reviewers, team leaders are expected to participate in all critical investigation reviews, while the GM may participate in any review and is expected to participate in reviews of Level 1 and Level 2 investigations. The chief commissioner is expected to participate in critical reviews of any investigation that the Commission is to review, and the entire Commission is expected to participate in reviews for Level 1 and Level 2 investigations.

The briefing is organized and run by the IIC. The Policy and Procedures Manual specifies that PowerPoint may be used, but that the preparation of briefing materials should not take the IIC away from other work.

7.1.2 Peer review process

At the TSB, designated Standards and Performance investigators review reports, whereas the ATSB has a peer review process to which any investigator can be assigned.

The team leader selects peer reviewers, taking into account individual workload, any specialist expertise required, and prior involvement in the occurrence.

The review is intended to verify that the investigation is complete; that the findings are supported by the facts; that no important elements have been left out; and that the report conforms to ATSB structure and writing style.Footnote 34

Peer reviewers can use a checklist in SIIMS to guide their review, and then record their observations in a Web-based form. The IIC may also use the Web-based form to record actions taken in response to each point the peer reviewer raised.

The TSB Review of the Norfolk Island, Kangaroo Valley, and Canley Vale investigations found that the peer review process was inconsistently applied, and that it was ineffective in addressing the quality issues in the Norfolk Island investigation. However, in that instance, the shortcomings were due largely to the process being used inappropriately as a substitute for direct intervention by managers, and to the results of the peer reviews not being followed up effectively. In the other two investigations, the process proved practical and added value to the investigation.

7.1.3 Closure briefings

The investigation closure briefing is given by the IIC to ATSB management, and the level of participation in it is commensurate with the scope of the investigation. The briefing is to cover project risks identified during the investigation, outstanding safety action, and lessons learned in the course of the investigation. The ATSB procedures manual suggests that the briefing focus equally on areas for improvement and what went well during the investigation.Footnote 35

The advantage of making a closure briefing a formal requirement is that it causes everyone involved in an investigation to stop and reflect on it, creating the opportunity for continuous improvement.

7.2 Analysis tools

7.2.1 Identification and tracking of safety issues

The ATSB’s practice is to identify safety issues that are systemic and need to be addressed in order to reduce risk, and to track the adequacy of safety action taken in consequence. In this way, in addition to monitoring responses to its recommendations, the ATSB has the means to demonstrate the extent and adequacy of safety action taken to correct all higher-risk safety issues identified during its investigations.

Only certain findings in ATSB reports are labelled safety issues. Safety issues are defined as safety factors that

  • can reasonably be regarded as having the potential to adversely affect the safety of future operations; and
  • are characteristic of an organization or a system rather than of a specific individual, or are characteristic of an operational environment at a specific point in time.Footnote 36

Safety issues are rated critical, significant, or minor according to risk. Safety action taken on safety issues is described in the Safety Action section of the investigation report, and is available in a searchable database on the ATSB website (Figure 1). The adequacy of safety action for critical and significant safety issues is assessed and tracked.

The adequacy of safety action taken is one of the measures used to assess the ATSB’s effectiveness. Two of the ATSB’s key performance indicators are that safety action be taken to address 100% of the critical safety issues identified and 70% of the significant safety issues identified. In fiscal year 2012–13, the ATSB identified 0 critical, 34 significant, and 49 minor safety issues across all modes of transport. It reported that 71% of significant safety issues had been adequately addressed, 3% had been partially addressed, and 26% still had safety action pending.Footnote 37

Figure 1. Screen capture of safety issue from Kangaroo Valley InvestigationFootnote 38
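As a quick arithmetic check of the figures above (a minimal sketch; the underlying counts are inferred from the published percentages and are not stated in the source), the reported proportions are mutually consistent with the 34 significant issues identified:

    # Arithmetic check of the reported fiscal 2012-13 figures. The counts
    # below are inferred from the published percentages, not stated in the
    # source.
    significant_total = 34
    counts = {
        "adequately addressed": 24,   # 24/34 rounds to 71%
        "partially addressed": 1,     #  1/34 rounds to 3%
        "safety action pending": 9,   #  9/34 rounds to 26%
    }

    assert sum(counts.values()) == significant_total
    for label, n in counts.items():
        print(f"{label}: {n}/{significant_total} = {n / significant_total:.0%}")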

7.2.2 Evidence tables

The ATSB's Safety Investigations Guidelines Manual – Analysis describes the format and use of evidence tables.Footnote 39 Essentially, evidence tables provide a framework for identifying a potential finding (or hypothesis) and listing the available information that supports or refutes the finding. This allows the comparison of the relative strengths of different possible findings.

Evidence tables for safety factors provide a place to record the results of the tests for existence, influence, and importance, which are used to determine whether the finding relates to cause, and whether it will be considered a safety issue (Figure 2).

Figure 2. Reproduction of the ATSB's safety factor evidence table
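Footnote 39 remains the authoritative description of the format. Purely as an illustration of the idea described above, and not the ATSB’s schema, the sketch below lays out competing findings with the evidence that supports or refutes each, so that their relative strengths can be compared; the scenarios and the crude count-based comparison are hypothetical stand-ins.

    # Illustration of the evidence-table idea described above: each candidate
    # finding is laid out with its supporting and refuting evidence so that
    # competing scenarios can be compared side by side. The scenarios and the
    # crude count-based comparison are hypothetical, not ATSB practice.
    from dataclasses import dataclass, field

    @dataclass
    class EvidenceTable:
        finding: str
        supports: list = field(default_factory=list)
        refutes: list = field(default_factory=list)

        def net_support(self):
            # Real reviews weigh evidence qualitatively; a count is a stand-in.
            return len(self.supports) - len(self.refutes)

    scenarios = [
        EvidenceTable("The system had been rigged before the event",
                      refutes=["no physical trace found", "witness sequence"]),
        EvidenceTable("The system was planned but never rigged",
                      supports=["witness sequence", "equipment found unrigged"]),
    ]

    for s in sorted(scenarios, key=EvidenceTable.net_support, reverse=True):
        print(f"{s.net_support():+d}  {s.finding}")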

7.3 Investigation management

7.3.1 Investigation log

The ATSB makes extensive use of an investigation log within SIIMS. All three investigations reviewed had extensive entries in the log, which provided a detailed chronological sequence of investigative events with hyperlinks to key documents pertinent to each one.

7.4 External communications

7.4.1 Communication with stakeholders during DIP process

The TSB review team observed that, depending on a number of factors including investigation scope, complexity and sensitivity, ATSB investigators met in person with stakeholders during the DIP process.

There is no question that delivering a safety message and receiving a response from stakeholders have to be done in writing. Nonetheless, both parties are more likely to understand each other’s (possibly different) perspectives when they talk face-to-face. A conversation affords an opportunity to develop a safety message that addresses any objections stakeholders might have, increasing in turn the probability that effective safety action will be taken.

7.4.2 ATSB short investigations

For Level 5 investigations, the ATSB publishes brief summaries of factual information and safety action taken as a result of the occurrence. Short investigations may also contain a brief safety message or links to additional material.Footnote 40

The short investigations are compiled and published in the form of bulletins.Footnote 41 The aviation investigation bulletins are issued more or less monthly and contain about a dozen short reports.

7.4.3 Summary page for investigation reports

Each ATSB investigation report begins with a summary of what happened and what the ATSB found, and highlights the safety action taken and the key safety message, all in a short format that is easy to read and likely to be read.

Figure 3. Safety Summary page from Kangaroo Valley Report (AO-2011-166)