"Ain't No One Here But Us Social Forces": Constructing the Professional Responsibility of Engineers by Michael Davis
- Disregarding the magnitude of the responsibility seems incredibly disingenuous. The best analogy I can make is this: alcohol may make you do something that you otherwise would not do, and consuming alcohol while knowing that makes me responsible for my actions. Though I may do something I otherwise would not, I take on the responsibility for those intoxicated actions by drinking in the first place (knowing all the while that my inhibitions will be lowered). If this logic is applied to engineering, it stunts growth. If the development of A.I. systems may have negative consequences, why should an engineer take on the responsibility of such a feat? If I am the lead software engineer of an A.I. system used in self-driving cars, and an inconceivably low-probability problem occurs that leads to the death of 10 people, then by this line of thinking I should be responsible: I knew that such issues might occur, but I took on the task anyway. While these two examples are undeniably similar in form, there seems, to me at least, a large difference between them (and even if I saw no difference, holding the two in the same light can nonetheless stunt technological growth). In any attempt to create an inconceivably complex system, there has to be error; it is impossible for there not to be. Should such a system not exist because there is no error-free way of creating it? Should the head engineer take on the task anyway and face the repercussions if one of these errors occurs? I think the answer to both questions is "Of course not." There are degrees of probability and scale that need to be taken into account. These two examples, while seemingly synonymous, are radically different.
- I left my gun on the floor and my kid shot himself.
- "Well, my kid is the one who shot himself, so there is no one to blame but himself."
- "I shouldn't have left my gun on the floor because it was possible that my kid may inspect the weapon and shoot himself."
- I left my gun on the floor and my roommate shot himself.
- "I left my gun on my floor, but he is an adult who has the cognizance to know that playing with a gun might not be a great idea."
- "While my roommate may have shot himself, I should not have left my gun on the floor and given him the opportunity to do so."
- Much like the God example that Davis gave, these all seem like reasonable stances, right? Well, not really. I don't think any rational adult would leave a gun on the floor and blame a child for shooting themselves with it. This is due to the term "scale" that I mentioned earlier: there are scales of responsibility that we can hold. The child is, in fact, responsible for shooting himself under the idea of "blame" that Davis mentioned, but the child also didn't have the cognizance to know that such actions were a bad idea. The kid probably knew that guns are bad and that they have the ability to harm people, but that wasn't necessarily enough to stop him, because the kid didn't have the thought process of "I don't know how to operate a gun, and doing so may result in me harming myself or others." We can't expect a child to do that, and while comparing children to adults may seem odd, I think there is a comparison to be made nonetheless. An extremely complex system like an advanced self-driving car has many hands in its making, and many different kinds of expertise as well. The mechanical engineers may not know exactly how the code works, and the software engineers may not know exactly how the car works, but each has their respective responsibilities for what their role entails. Both parties can acknowledge that something may go wrong, yet they do their respective best to make the system as safe as possible. They may understand that problems can arise, just like the child, but they take on the task anyway. Again, these are very similar examples that would likely lead to different logical processes when defining responsibility (even in the "Feinberg's engineering" sense).
The parent shouldn't have left the gun on the floor and given the unsuspecting child the possibility of picking it up; but in that same breath, saying the engineers should not have released that statistically safe system because failure was still possible through some set of many causes is another matter. While the first statement makes sense, the second seems ridiculous and unsatisfactory, since any condition can lead to failure.
- Davis sets up a series of "different blames," and then fails to adequately address all of them, writing off the finale of the paper as "Engineers gain more than they lose by taking blame." Sure, Davis: in a world dictated by rules and laws, somehow there will be no negative repercussions, and engineers can go about solving problems by analysis and reason. I really wish I could live in a world where such rationality exists, but that is not how the current system works. If I shared the same disregard that Davis has for the reality of blame under the law, I would morally agree with him, but Davis is not making just a moral argument; he is making much more than that. A moral argument cannot be extrapolated into a position on how engineers should take responsibility for their work in the real world.