Langdon Winner's "Do Artifacts Have Politics"


  • While I have not done specific research on the politics of technology, I have a strong inclination to disagree with Winner in the introduction of this article. Artificial intelligence is something that has had a lot of political and ethical debate surrounding it, and rightfully so. Even though the true impact of "super-intelligent" A.I. (a broad term for systems that surpass or simulate human-level intelligence, and which are, at the very least, not in the foreseeable future) is not currently known, the varying hypotheses around such impact are infinitely fascinating. I stand on the side of what I like to call "singularity skepticism," which, in short, is a skeptical view of the singularity's hypothesized occurrence and impact. Most qualified computer scientists are found on this side, but the general population is overwhelmingly afraid of A.I. systems, both super-intelligent and simple. If I were to give a simple statement of my views on ethical consideration for A.I., I would claim that super-intelligent A.I. systems should not have the same moral consideration that humans have, because their sentience is only simulated; however, in the case that A.I. simulation becomes a biological process, as it hypothetically might, this might change my view. All of this conversation falls under the purview of what Winner is talking about, so I find myself highly interested in this article. That said, regardless of what politics technologies are given (or inherently have, depending on your view), I have a very strong aversion to halting technological progress on the basis of morals. Even if my views drifted toward Winner's, I cannot conceive of conceding that stance.  
  • All these damn philosophers talk about civil engineering!!! There are other fields of engineering that don't revolve around civil development!
  • I have a weird relationship with arguments that revolve around intent vs. outcomes. In situations like literacy tests for voting, I surely don't want the outcome of voter suppression for the disenfranchised; however, I intrinsically dislike ascribing intent that is not inherently included in the outline of the policy (or whatever else may apply). Even if racist intent is nearly certain, I still have moral issues regarding "nearly." I find myself constantly grappling with how I should go about morally understanding issues of this nature, and I am currently torn. If only racist people would just be more direct with their racism, so that nice people could ostracize them without having to arbitrarily infer their racist intent. 
  • I agree with Winner's idea around the ethics of technologies with an oppressive nature: I don't want them to exist. But, as I said, I don't want to ascribe intent that is not there, and I am not comfortable saying that outcomes are the only thing that matters, because, by that logic, anything can be morally neutral so long as no outcomes exist yet. 
  • When technologies regarding agriculture began to develop quickly, there was a series of outcries from the workers that such technologies replaced. This is a statistic that I read a while ago, but it is a close representation of the original: just one hundred years ago, agriculture was 96% of the job market; now, agriculture takes up only 2% of the job market, and we stand at a very similar, if not smaller, unemployment rate than we did then. Imagine this same phenomenon with Uber/Lyft and the implementation of self-driving cars. Is it immoral to advance technology even if it steps on the toes of the job market (admittedly much less so in this case, but the idea stands)? I would say no. Trends in the labor market show that regardless of what technologies may arise, the labor market will still exist. Even if the industries that dominate it change, it will nevertheless persist. Software engineering, for instance, is a career path that didn't exist at the start of my grandparents' lives (and nearly my parents' lives), and without the advent of the computer, no such career would exist. On that same note, however, this industry is threatening to automate nearly all of the jobs that make up the current job market. As I stated earlier, I think everything takes a backseat to technological development, and stunting technological development undermines mankind as an idea. 
    • In modern times, no one would argue against the advent of technologies such as the printing press or farming machines, but such arguments existed at the time of their development. Socrates, undeniably one of the greatest philosophers of all time, himself said the advent of writing would stunt intellectuals and create forgetfulness. Humans are shortsighted and very bad at predicting the future, and halting technology by human whim seems to doom us to failure or stagnation. Imagine a world where we didn't write our ideas down. No thanks, Socrates. Let technology define the course of modern history; by the nature of development and evolution, the best will thrive and the worst will fall. 
