Slaughter of the Innocent

There is a lot of talk of gun control after the recent school shooting in Connecticut. I consider “nonviolence” to be the term that most closely aligns with my political views. Why then do I generally disagree with gun control laws?

First, there are the practical problems:

  1. 3D-printed guns are not far off. Are we going to outlaw 3D printing? Or try to regulate access to the plans? With anonymity networks like Tor and other freedom-enabling technologies, it's going to be difficult to effectively control access to weapons in the near future without a level of surveillance that most people would find 1984-esque.
  2. Even if you do keep guns away from people, they will just build bombs. If somebody really wants to kill people and make a big stink, they can always build a homebrew bomb.

So we can try to keep the guns away from the killers, but will it actually be effective at what it seeks to do? Maybe somewhat, and for a short while. But is it worth it, given the tradeoff in terms of our freedom?

Often when somebody uses their freedom to do something we don’t like, we are tempted to say “people shouldn’t have that freedom.” But if we follow this to its logical conclusion, we wind up living in a world where the lowest common denominator dictates the freedoms we all have.

Outlawing guns is kind of self-defeating. How is such a law going to be enforced? Ultimately, with guns. So what we’re saying is that we want to allow a certain class of people — those in the government with the blessings of the majority (at least ideally) — access to guns, but not ordinary citizens.

I reject the idea that the individuals within our government are any more fit to carry a gun than the average citizen. As evidence for my position, I give you the fact that our government is regularly engaged in the killing of innocent children overseas. I don't think the government has demonstrated enough restraint in its use of guns, so my stance is this: you can take away guns from the people only after the government has also laid theirs down.

Finally, there are spiritual problems with the whole argument of gun control vs. gun freedom. It avoids the underlying question of why we want to kill each other in the first place. The only real solution is one which addresses why there is so much violence in our culture to begin with. This answer cannot be understood without a critical look at our plates.

To avoid the innocent being senselessly slaughtered, we must stop senselessly slaughtering the innocent.

getting what you want, robot-style

The robots are coming! And it’s going to be awesome.

Robots mean we get to automate away a lot of our problems. I like automating. It frees us up from tedious work. I envision a future in which all the unpleasant work is done by robots, and we are free to spend our time on other pursuits: art, family, intellectual endeavors; heck, whatever we can dream up!

However, as our society becomes more automated, we need a stricter sense of ethics. There are three reasons this is more important than ever: automation scales, automation is fast, and automation can be powerful.

Automation scales

Whatever kind of automated system we create, we're gonna get lots of... whatever it does. So we'd better think hard and make sure that whatever it does is something we want to be a part of.

A fine example of this is factory farming. As we have increased the efficiency and automation of raising and slaughtering animals for food, we have created an environment that would repulse even the strongest stomach. Almost everyone would agree that there's got to be a better way... and many are starting to search for that way. Some want to back off from the capitalist progression towards increased efficiency (and automation!) in how our food is produced. Others (like me) think that's a losing battle, and that the factory farm is simply showing us, in a concentrated fashion, what necessarily goes into a meat-based meal.

Automation is fast

Whatever we automate tends to get faster and faster at whatever it does. This means if we screw up the design, we could screw things up really fast — without time to intervene manually to fix it.

The perfect example of this is in finance. I used to work at a high-frequency trading company, where we wrote programs whose task was to exploit tiny market inefficiencies as fast as possible. We, together with other companies like us, automated away the work of many thousands of market traders. In 2010 there was a "flash crash," widely attributed to some funny interaction between (or bug in) robotic trading systems. The flash crash was a mini market crash that unfolded so quickly it would have been difficult for a person to evaluate the situation and react thoughtfully. Fortunately, most firms and exchanges have the forethought to engineer failsafe checks into their robots, which do things like halt trading when something unexpected occurs.
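For the curious, a failsafe of that kind can be sketched as a simple kill-switch: if the price moves more than some threshold within a short window, the robot halts itself and waits for a human. Everything here (the class name, the thresholds, the tick format) is a hypothetical illustration, not how any real firm's checks actually work:

```python
from collections import deque
import time


class KillSwitch:
    """Hypothetical sketch: halt trading when prices move too far, too fast."""

    def __init__(self, max_move=0.05, window_seconds=10.0):
        self.max_move = max_move      # e.g. halt on a 5% move within the window
        self.window = window_seconds  # how far back to compare prices
        self.prices = deque()         # (timestamp, price) ticks within the window
        self.halted = False

    def on_price(self, price, now=None):
        """Record a new price tick; return True if trading should halt."""
        now = time.time() if now is None else now
        self.prices.append((now, price))
        # Discard ticks older than the lookback window.
        while self.prices and now - self.prices[0][0] > self.window:
            self.prices.popleft()
        oldest_price = self.prices[0][1]
        # Halt if the price moved more than max_move within the window.
        if abs(price - oldest_price) / oldest_price > self.max_move:
            self.halted = True
        return self.halted
```

Real systems layer many such checks on top of each other (position limits, message-rate limits, exchange-level circuit breakers), but the shape is the same: detect the anomaly faster than a human could, stop trading, and let the humans catch up.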

All this is simply to say that as we automate and increase the speed of our financial system, we are going to get results faster, so whatever direction it's headed in had better be a good one. Once again, I don't oppose the automation per se, but I think it provides us a good opportunity to look deeply at our entire financial system and ask whether it's on solid footing. Because if it's not, we're bound to find out soon, and we may not have time to react.

Automation can be powerful

In my third example, automated killing robots (a.k.a. “drones”), the relationship between the ethics of our decisions and the reality on the ground is perhaps the most direct.

We have been automating war for a long time. Warmakers often get access to new technologies first, because it sure is better to be the team that has airplanes than the team that doesn't. Until now, however, we've always had to risk losing lives when we went to war. Now we don't have to risk anything except a few (expensive) hunks of metal and computer parts.

Whether or not you think drones are a good idea, it’s unlikely that they are going away anytime soon. On the contrary, China and other nations are working overtime to create their own drone fleets to match ours.

The concept of drone wars isn’t all bad — if both sides are fighting with drones then nobody gets hurt, after all. What automation brings to the table is, once again, the necessity to look really hard at the ethics of each decision that goes into the drone war. Is this war truly being fought for just reasons? Because it’s easier than ever before for the military to go to war when it doesn’t have any weeping mothers and body bags to atone for. At least, not on the winning side. And perhaps it seems far-fetched to some in the U.S. right now, but in some parts of the world there is always the possibility that drones might be used to scare the populace of a country into supporting one leader or another.

Ultimately, somebody has to write the rules for how the robots act. In the case of drones, perhaps it’s constitutional law that ultimately governs the robots. In the case of markets, it’s financial companies, exchanges, and possibly regulators. And in the case of factory farming, it’s ultimately the consumer who decides what the history of their meal is going to be. In all cases, automation acutely brings to a head the need to look hard at the direction the automation is going and make sure it’s a world we want to live in. In a real sense, we need to be “careful what we wish for” because we are getting more of it, faster, every day.

It’s either that, or we may have to fire the robots. And I, for one, am looking forward to robotopia!