The crash of Air India Flight 171 on June 12, 2025, killed 260 people and triggered immediate scrutiny into what went wrong. Unlike many aviation disasters linked to weather, technical failure, or miscommunication, early findings in this case pointed to deliberate human action. Preliminary reports confirmed that both engine fuel switches were manually turned off seconds after takeoff—an action that required intent, not accident. As investigators examined cockpit audio, pilot histories, and aircraft systems, questions emerged about mental health oversight, operational safeguards, and whether existing safety protocols were ignored or insufficient. This wasn’t just a technical failure—it was a human systems failure. The tragedy has reignited debate over how pilots are evaluated, how airlines respond to known safety risks, and what accountability should look like when trust in aviation systems is shaken.

A Catastrophic Crash Under Unusual Circumstances
Air India Flight 171 is being investigated not just as a technical failure, but potentially as a deliberate act by one of the pilots. The Boeing 787 Dreamliner went down shortly after takeoff from Sardar Vallabhbhai Patel International Airport in Ahmedabad on June 12, 2025, killing 241 of the 242 people on board and 19 more on the ground. The Aircraft Accident Investigation Bureau (AAIB) found that both engine fuel switches were manually turned off seconds after takeoff—something that cannot happen by accident, because their locking mechanisms require deliberate physical action. The landing gear was never retracted, and while one engine began to relight, the aircraft didn’t have enough time or power to recover before crashing into a residential area.
According to the preliminary report, cockpit voice recordings captured one pilot questioning the shutdown—“Why did he cut off?”—with the other denying responsibility. Investigators are now looking at whether this was a catastrophic error or an intentional act. Captain Sumeet Sabharwal, who was monitoring the takeoff, had over 8,200 flight hours and had reportedly been grieving the recent loss of his mother. Aviation safety expert Captain Mohan Ranganathan claimed that Sabharwal had taken multiple periods of medical leave in the past few years and may have been experiencing psychological strain. However, an official from Air India’s parent company, Tata Group, disputed these claims, stating that there were no recent records of medical leave and that both pilots had passed the required Class I medical exams.
Other potential causes, including mechanical malfunction, are still being evaluated. The ram air turbine (RAT), which provides emergency backup power, was deployed during the incident, signaling a complete power loss. At the crash site, the fuel switches were found in the “run” position, suggesting an attempt was made to restore power after the engines cut out. It remains unclear whether the switch-off was part of a misjudged cockpit decision, a tragic mental health crisis, or something else entirely. Fuel samples were deemed satisfactory, and no bird activity or weight imbalance was detected, ruling out several common causes of engine failure.
As the investigation continues, families of victims have pushed back against what they see as a rush to blame the pilots, especially when they’re no longer alive to respond. Ameen Siddiqui, who lost six family members in the crash, told The Telegraph that he believes the report is a cover-up to shield the airline and government from responsibility. While experts like Ranganathan argue that the deliberate movement of both fuel levers is difficult to explain any other way, public skepticism remains high. The crash has opened an urgent discussion in aviation about how mental health is monitored, reported, and managed, particularly when the consequences of oversight can be catastrophic.
Mental Health in Aviation – A System Under Pressure
The Air India Flight 171 tragedy has reignited scrutiny over how mental health is monitored in aviation—a field where personal struggles can have high-stakes consequences. Both pilots had passed India’s Class I medical evaluation, a certification that includes physical and psychological assessments. But passing a medical exam doesn’t always mean a pilot is mentally well. Experts have long pointed out gaps in the system, especially in detecting depression, burnout, or grief, which may not surface during routine screenings. In this case, Captain Sabharwal had reportedly taken time off in past years for medical reasons, and sources claimed he was still dealing with the emotional impact of his mother’s recent death. Although he was “medically cleared” to fly, the case raises a pressing question: how effectively are psychological stressors evaluated and tracked over time?
Unlike physical health conditions, mental health issues are often underreported or hidden due to stigma and fear of professional consequences. Pilots may avoid seeking help because they worry about being grounded or labeled unfit to fly. This creates a dangerous dynamic in which even serious psychological distress can go unnoticed or unaddressed until something goes wrong. Dr. Quay Snyder, president of the U.S.-based Aviation Medicine Advisory Service, has noted in past discussions that pilots operate in a system that expects disclosure but does little to protect those who come forward. The expectation is that pilots self-report mental health issues—but without robust safeguards or support systems, many simply don’t. In the Air India case, if Captain Sabharwal was indeed struggling, it’s unclear whether he had access to the kind of confidential support or structured follow-up that could have made a difference.
Globally, regulations around pilot mental health are inconsistent. Following the Germanwings Flight 9525 crash in 2015, where a co-pilot deliberately flew an Airbus A320 into the French Alps, several international aviation bodies revised protocols, including mandatory psychological assessments and peer-support programs. However, implementation has varied. In India, mental health remains a weakly enforced component of pilot screening. The Civil Aviation Requirements (CAR) guidelines require psychological assessments only in specific scenarios, such as when a pilot is returning to duty after long absences. Routine evaluations still rely heavily on self-disclosure and observational assessments by aviation medical examiners—an approach that’s vulnerable to human error and bias.

Safety Protocols and Systemic Gaps
Beyond mental health concerns, the crash of Flight 171 has also raised pointed questions about the safety systems and maintenance protocols in place at Air India and more broadly across the aviation industry. One of the most alarming aspects of the incident is the manual shutdown of both engine fuel switches just after takeoff—an action that requires deliberate manipulation due to their locking mechanisms and protective guard brackets. These switches are not designed to move easily or accidentally. According to Captain Mohan Ranganathan, a veteran aviation safety expert, this type of mechanical setup is specifically intended to prevent unintended movement during flight operations. The fact that both levers were switched off at such a critical moment, and then reset to “run” seconds later, indicates either a severe lapse in cockpit judgment or an intentional act—either way, a systemic failure allowed this to happen.
Compounding this is the question of compliance with international safety advisories. In 2018, the U.S. Federal Aviation Administration (FAA) issued a bulletin warning that in some Boeing aircraft, the locking feature of the fuel switches could be disengaged, increasing the risk of inadvertent operation. Although the FAA’s recommendation was advisory rather than mandatory, it urged airlines to inspect and confirm the proper functionality of these switches. There is currently no public confirmation that Air India performed such checks on the aircraft involved in the crash. If the safety inspection wasn’t carried out—or if it was inadequately documented—it reflects a broader issue: safety protocols are only as effective as their implementation and follow-through.
The deployment of the ram air turbine (RAT) shortly after takeoff further signals a total loss of electrical and hydraulic power, which should never occur under normal circumstances. The RAT is a backup system activated in emergencies when both engines fail or power is lost entirely. Its activation in this case confirms the severity of the situation, yet it also shows that the backup systems worked as designed under duress. Still, these backup systems are last-resort measures—they are not intended to compensate for deliberate or catastrophic missteps during basic phases of flight. The aircraft carried no dangerous cargo, the weight was within limits, fuel samples were clean, and weather conditions were not a factor. That narrows the potential root causes to human action and oversight.
What this incident reveals is a layered failure—where individual behavior, procedural enforcement, and organizational accountability intersected with fatal consequences. It’s not enough to certify that a switch works or that a pilot passed a medical test. Systemic safety requires redundancy not just in machines, but in policies, training, and internal culture. Airlines must not treat safety advisories as optional, especially when they concern components tied directly to engine control. Investigations like this one expose more than a single point of failure—they highlight the cumulative effect of overlooked guidelines, inconsistent enforcement, and the absence of proactive risk management.
Accountability, Transparency, and the Public Response
As details surrounding the Flight 171 crash continue to unfold, public reaction has been sharply divided between trust in the investigation process and suspicion that key information is being suppressed to protect institutional interests. Families of the victims have publicly challenged the official narrative, accusing Air India and government agencies of shifting blame to the deceased pilots—particularly Captain Sumeet Sabharwal—who cannot respond or defend their actions. Ameen Siddiqui, who lost six family members in the crash, called the preliminary report “a cover-up,” questioning how such a critical system failure could be pinned solely on pilot error without exploring broader operational or mechanical issues. His concerns reflect a wider discomfort with the lack of immediate transparency and the perception that accountability is being sidestepped.
This tension is not new in aviation disasters, particularly those involving potential human error. Airlines and regulators often face intense pressure to maintain public confidence, but when communication is vague or deflective, it undermines credibility. In this case, Air India acknowledged the preliminary report and offered public condolences, but declined to comment further, citing the ongoing investigation. While that is standard procedure during active inquiries, the absence of detail has allowed skepticism to fill the void. Statements from experts like Captain Ranganathan—who asserted that the fuel switch positions “had to be deliberately done”—have only fueled the narrative that something deeper and possibly preventable occurred. Yet no conclusive motive has been established, and Air India has maintained that both pilots were certified fit to fly.
There’s also a broader concern about how investigations are framed and communicated. The AAIB’s preliminary report raises legitimate technical questions, but it has not fully addressed whether all prior safety advisories were implemented or how pilot mental health histories are monitored over time. Without clear disclosures on these issues, it’s difficult for the public to assess whether this tragedy resulted from isolated human error, a breakdown in institutional oversight, or both. In high-fatality events like this, transparency isn’t just a matter of ethics—it’s essential to restoring trust and preventing recurrence.
At the center of all this are the 260 lives lost: families in transit, business travelers, and children—eleven children were on board, including two newborns. The only survivor, Vishwash Kumar Ramesh, lived through the crash but lost his brother, who was seated just across the aisle. These stories drive home the human toll behind technical failures and policy shortcomings. The demand for accountability is not only about assigning blame—it’s about ensuring that structural weaknesses are acknowledged and fixed, so that another set of families doesn’t face the same unanswered questions.

Practical Takeaways for Passengers and the Public
While the cause of the Flight 171 crash is still under investigation, the tragedy highlights important realities about flying that the public should be aware of—especially when it comes to safety, transparency, and mental health in aviation. First, it’s worth remembering that commercial air travel remains statistically very safe. Accidents involving deliberate actions or major systemic failures are rare, but when they happen, they reveal where systems need to improve. For passengers, this doesn’t mean avoiding flying—it means being informed about how airlines handle safety, pilot screening, and crisis response. Travelers can look into an airline’s safety record, which is often publicly available, and understand that international carriers are held to different regulatory standards depending on the country.
Second, the conversation around pilot mental health isn’t just an internal industry issue—it affects everyone who boards a plane. One thing passengers often don’t realize is that many pilots work under intense schedules and pressure, and mental health support systems vary significantly between airlines. After the 2015 Germanwings crash, several countries added psychological assessments and peer-support networks for pilots, but implementation remains inconsistent. Passengers have no role in screening pilots, of course, but they can support calls for more rigorous, confidential mental health frameworks across airlines. That means backing policies that encourage early help-seeking by pilots—without fear of job loss or stigma—and holding regulators accountable for enforcing those policies.
Third, it’s important to recognize what “accountability” looks like in aviation. Most people aren’t following investigative reports or reading FAA bulletins, but these documents often reveal whether airlines and aviation authorities are addressing known risks. For instance, the FAA’s advisory about the fuel switch locking mechanism was issued in 2018—well before the Flight 171 crash—but was not mandatory. This type of gap between recommendation and enforcement is where oversight often falls short. Public pressure and informed media coverage can help push for stricter compliance, not just in response to tragedy, but to prevent it.

