The accident could not have come at a worse time for Uber. Already under scrutiny following several high-profile lawsuits alleging sexual harassment, intellectual property theft and other misconduct, accusations that played at least as significant a role in founder and former CEO Travis Kalanick’s departure, the ride-hailing firm saw one of its driverless test vehicles involved in a fatal pedestrian accident earlier this year. It was later revealed that the test driver, who was paid to take over driving in case of an emergency, was watching an episode of “The Voice” and was too distracted to notice a pedestrian who stepped out of the darkness directly into the car’s path.
Pending the final verdict of the National Transportation Safety Board (NTSB) investigation, it remains to be seen whether the Uber test driver could or should have been able to avoid the pedestrian, or at least had time to apply the brakes and lessen the impact. However, the Uber SUV’s self-driving sensor system reportedly failed to detect and react to the pedestrian because of software failings; in other words, the system failure likely can be attributed to just plain bad code.
Prior to the accident, Uber software developers and engineers were already struggling with ethical concerns about algorithms they had created that, among other things, allegedly were designed to exploit underpaid drivers, deny service to passengers and skirt governmental regulations. Former software engineer Susan Fowler also described a toxic culture in her widely publicized account of sexual harassment at Uber.
However, the allegations levied against the company, even if true, can be seen as an anomaly in software development and DevOps culture in general. One can suppose, for example, that basic human decency leads most software developers to refrain from unethical behavior and from writing code for nefarious purposes. Data from recent studies supports this assumption.
According to a recently published survey by Stack Overflow, only a fraction of the more than 100,000 developers polled said they would write unethical code or claimed “that they have no obligation to consider the ethical implications of code.” When asked what they would do if asked to write code for an unethical purpose, 58.5 percent reported they would not, while 36.6 percent said it depended on what it was. Only 4.8 percent said they would.
However, the report describes a lot of “ethical gray.” “Developers are not sure how they would report ethical problems and have differing ideas about who ultimately is responsible for unethical code,” Stack Overflow said in a statement.
Meanwhile, the fact that many of Uber’s arguably poor ethical choices originated with upper management and its DevOps team underscores how, conversely, it is up to DevOps to instill ethical guidelines for organizations to follow. To that end, DevOps must create, among other things, “full audit trails showing who developed what code and when and a culture that enables developers to voice ethics concerns without fear,” said Torsten Volk, an analyst for Enterprise Management Associates (EMA).
“Knowing that a full audit trail is in place will make DevOps more likely to raise a red flag with their team leads,” Volk said. “Then, ideally, the outcome of this discussion should be documented in a source code management system.”
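The audit trail Volk describes largely already exists in any team’s version control history. As a minimal sketch, and assuming a Git-based workflow (the repository, file name and author below are invented for illustration), the following shows how commit metadata answers “who developed what code and when”:

```shell
#!/bin/sh
# Hypothetical sketch: Git commit metadata as a lightweight audit trail.
# The demo repository, file and author below are illustrative only.
set -e

# Create a throwaway repo with one commit so the commands below have data.
repo=$(mktemp -d)
cd "$repo"
git init -q .
git config user.name "Dev One"
git config user.email "dev1@example.com"
printf 'multiplier = 1.0\n' > pricing.py
git add pricing.py
git commit -qm "Add base pricing multiplier"

# Who changed this file, and when: one line per commit.
git log --format='%h %an %ad %s' --date=iso -- pricing.py

# Line-by-line authorship of the current version of the file.
git blame --date=short pricing.py

# Which commit introduced a given string (useful when a change is flagged).
git log -S 'multiplier' --oneline
```

In practice the history would live in the team’s existing source code management system rather than a temporary repo; the point is that `git log` and `git blame` already attribute every line of code to an author and a timestamp, which is the raw material for the documented ethics discussions Volk recommends.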
In addition to sending the message to their developers that following ethical principles during code creation is the right thing to do, organizations should foster an ethical culture for software development, and for the enterprise as a whole.
“It is critical for the entire business to understand the net positive impact of ethical conduct in development on the business’s risk profile and overall bottom line,” Volk said. “Similar to GDPR compliance, there should be awards for outstanding ethical conduct in coding that can be published to the outside, showing customers the value of doing business with this company.”