When I’m trying to persuade someone that people ought to concentrate on effectiveness when choosing which charities to fund, I sometimes hear the worry that this emphasis on cold calculation risks destroying the crucial human warmth and emotion that should surround charitable giving. It’s tempting to dismiss this sort of worry out of hand, but it’s much more constructive to address it head on.ᵃ This situation happened to me today, and I struggled to give a short and accessible response. I came up with the following argument later, so I’m posting it here.
It’s often noticed that many of the best surgeons treat their patients like broken machines to be fixed, and lack any sort of bedside manner. Surgeons are also well known for their gallows humor, which has been thought to be a coping mechanism for dealing with death and with the unnatural act of cutting open a living human body. Should we be worried that surgery dehumanizes the surgeon? Well, yes, this is a somewhat valid concern, and one that is even being addressed (with mixed results).
But in context this is only a very mild concern. The overwhelmingly most important thing is that the surgery is actually performed, and that it is done well. If someone said “I don’t think we should have doctors perform surgery because of the potential for it to take the human warmth out of medicine”, you’d rightly call them crazy! No one wants to die from treatable appendicitis, no matter how warm and heartfelt their doctors are.
Likewise for cold, calculated charitable giving. I find it plausible that, if EA ideas were to become mainstream, a degree of coldness would be engendered. But this is a small and acceptable cost, and furthermore one we can at least partially defend against. It would be a serious mistake to think EA was a bad idea because of this risk.
(↵ returns to text)
- I also think it gestures at a real aspect of “EA culture”, although the direction of causality is unclear. It could just be that EA ideas are particularly attractive to us cold unfeeling robots.↵