Is any new technology good news or bad news? How would we know anyway? What anchor points do we need for the ethics of tech? Here is a toolbox of critical questions to ask of new tech, produced by one of the world’s leading Christian ethicists, Jacques Ellul. Many thanks to John Halton for the link:
76 Questions to ask of any new technology
by Jacques Ellul
What are its effects on the health of the planet and of the person?
Does it preserve or destroy biodiversity?
Does it preserve or reduce ecosystem integrity?
What are its effects on the land?
What are its effects on wildlife?
How much, and what kind of waste does it generate?
Does it incorporate the principles of ecological design?
Does it break the bond of renewal between humans and nature?
Does it preserve or reduce cultural diversity?
What is the totality of its effects, its "ecology"?
Does it serve community?
Does it empower community members?
How does it affect our perception of our needs?
Is it consistent with the creation of a communal, human economy?
What are its effects on relationships?
Does it undermine conviviality?
Does it undermine traditional forms of community?
How does it affect our way of seeing and experiencing the world?
Does it foster a diversity of forms of knowledge?
Does it build on, or contribute to, the renewal of traditional forms of knowledge?
Does it serve to commodify knowledge or relationships?
To what extent does it redefine reality?
Does it erase a sense of time and history?
What is its potential to become addictive?
What does it make?
Who does it benefit?
What is its purpose?
Where was it produced?
Where is it used?
Where must it go when it's broken or obsolete?
How expensive is it?
Can it be repaired?
By an ordinary person?
What values does its use foster?
What is gained by its use?
What are its effects beyond its utility to the individual?
What is lost in using it?
What are its effects on the least advantaged in society?
How complicated is it?
What does it allow us to ignore?
To what extent does it distance agent from effect?
Can we assume personal, or communal responsibility for its effects?
Can its effects be directly apprehended?
What ancillary technologies does it require?
What behavior might it make possible in the future?
What other technologies might it make possible?
Does it alter our sense of time and relationships in ways conducive to nihilism?
What is its impact on craft?
Does it reduce, deaden, or enhance human creativity?
Is it the least imposing technology available for the task?
Does it replace, or does it aid human hands and human beings?
Can it be responsive to organic circumstance?
Does it depress or enhance the quality of goods?
Does it depress or enhance the meaning of work?
What aspect of the inner self does it reflect?
Does it express love?
Does it express rage?
What aspect of our past does it reflect?
Does it reflect cyclical or linear thinking?
Does it concentrate or equalize power?
Does it require, or institute a knowledge elite?
Is it totalitarian?
Does it require a bureaucracy for its perpetuation?
What legal empowerments does it require?
Does it undermine traditional moral authority?
Does it require military defense?
Does it enhance, or serve military purposes?
How does it affect warfare?
Is it massifying?
Is it consistent with the creation of a global economy?
Does it empower transnational corporations?
What kind of capital does it require?
Is it ugly?
Does it cause ugliness?
What noise does it make?
What pace does it set?
How does it affect the quality of life (as distinct from the standard of living)?
I don't suppose any conceivable technology is going to pass the test 76/0 — does the technology of the wheel, for example, undermine conviviality or enhance it? But these are interesting dimensions of any tech, and they set an agenda for understanding what we are letting ourselves in for before we get there.
I’d also want to point out that these questions are far more useful with the word “How” added to the beginning of most lines than as pass/fail tests. In other words, this is a tool for assessing the probable qualitative impact of a technology, not just for deciding whether to stick your thumbs up or down. This matters because experience indicates that, most of the time, your thumbs down won't count for much anyway...