Even though there are many ways to define character education, for the moment let’s assume that Wikipedia provides a reasonable starting point: “…an umbrella term loosely used to describe the teaching of children in a manner that will help them develop variously as moral, civic, good, mannered, behaved, non-bullying, healthy, critical, successful, traditional, compliant or socially acceptable beings.” While there is plenty in this definition to inspire healthy debate, my primary concern with it is this: it assumes character development only applies to human beings.
The reality is that we are now infusing artificial intelligence into most things that we make. The more complex our machines become, the more their decisions begin to look like ethical judgments and, ultimately, expressions of character. Consider the case of self-driving cars that I pose to my media psychology graduate students.
Cars and Character Education
Imagine you are driving down the highway in the family SUV, your two children and the dog in the back seat. Suddenly, a deer jumps out in front of your car. You can:
1) Jump the curb and hope you don’t hurt your passengers, or the two people who are walking their dog on the sidewalk.
2) Hit the deer, knowing that doing so would probably injure or maybe even kill you (not to mention the deer), your passengers, and anyone in the cars behind you who swerves to avoid the accident.
3) Cross into oncoming traffic and take a chance you can outmaneuver all the cars headed straight for you.
A decision needs to be made in a split second.