|Humanity is besieged with the mounting dissonance of criticisms and shame: manipulations, depressions, polarization, politicking, vanity, lies, and addiction. / Photo by: Fabio Formaggio via 123rf|
The term “humane technology” is so hazy that it invites many interpretations. Taking off from Oxford’s definition of “humane” as “having a civilizing effect on people,” humane technology can be interpreted as technological development that puts compassion and concern for people first. Would it be correct, then, to say that the guillotine and the electric chair are humane technologies since they quickly execute the deserving?
The persistence of good and evil remains. We are left to choose between two paths: a nightmare in which unchecked technological power dictates, or a healthy relationship in which technology facilitates.
World-renowned design ethicist Tristan Harris has claimed that while technology has been fervently and profitably upgraded, humanity has been downgraded. Technology is pushing people toward ever more radical opinions while shrinking their attention spans. Technology has deteriorated humanity.
Humanity is besieged with a mounting dissonance of criticism and shame: manipulation, depression, polarization, politicking, vanity, lies, and addiction, all of which are interconnected harms of human degradation. More than two billion people are drawn into social media platforms whose end goal is not only to capture attention but to make users addicted to getting attention from others. Outrageous and extreme topics now flourish to keep people glued to advertising-fueled tech sites. By exploiting human weaknesses, technology preys on humanity, downgrading wellbeing while upgrading machines.
This is appropriately illustrated in the following instances:
- Extremism abuses brains. Seventy percent of the more than a billion hours spent on YouTube daily come from its recommendation system, which favors titles with keywords like demolish, obliterate, debunk, dismantle, rip, and destroy.
- Outrage abuses brains. According to a study in the Proceedings of the National Academy of Sciences, each moral-emotional word in a tweet raises its retweet rate by 17 percent.
- Insecurity exploits brains. In 2018, YouTube’s algorithm recommended anorexia videos to teenage girls who watched dieting videos, because such content catches attention.
- Conspiracies exploit brains. After a user watched a NASA Moon landing video, YouTube would recommend Flat Earth conspiracy videos; it did so millions of times. Alex Jones’s InfoWars videos were recommended by YouTube 15 billion times.
- Confirmation bias exploits brains. Fake news is unconstrained, whereas real news is limited to what is true; as a result, fake news spreads six times faster than real news, according to an MIT study of Twitter.
Social media has created an uncontainable digital Frankenstein. These parasitic tech platforms pose an urgent threat to society. Technology’s exploitation of human weaknesses has worsened from addiction to fakery, and our capacity to understand the world and to act together is plummeting. Humanity needs to change course.
Transition to Humane Technology
Human downgrading is catastrophic. Systems must be designed to protect humans and avert this downgrading. For technology to thrive, six principles are recommended for developing humane technology.
1. Humane technology should feel natural, not distant.
2. Humane technology should reinforce instincts and perceptions.
3. Humane technology holds human values as its foundation.
4. Humane technology resonates with the human senses.
5. Humane technology should empower people.
6. Humane technology must improve the human condition.
So What Can Be Done?
Not everybody is in a position to create change in the tech industry. Nevertheless, anyone can start a discussion about tech ethics. Teach the people around you about the common dangers of the products they use every day. Urge them to monitor their social media usage and direct them to resources where they can learn to share and experience life together.
On the other hand, if you are in a position of power at a tech company, establish an ethics board to provide guidelines for developing humane technology. Formulate ethical policies, operational guidelines, and organizational incentives to guide operations with an emphasis on social values. Use as references the Ethical OS guide, the Center for Humane Technology’s (CHT) Humane Design Guide, and CHT’s article on how to reverse the human downgrading trend.
In addition, production teams at tech companies can incorporate into products a humane social system design that protects human vulnerabilities.
As influential tech gatekeepers, Google and Apple can reshape app stores and business models to compete for consumers’ trust. Shareholders can demand that companies shift away from business models that carry huge risks, and instead encourage sophisticated yet humane business models.
|Urge people to monitor their social media usage and direct them to resources where they can learn to share and experience life together. / Photo by: Cathy Yeulet via 123rf|
The following are also some top points to remember:
• Activate insight on potential effects. Think through and imagine the scenarios that could unfold, including their consequences. Point your imagination in the right direction.
• Check the product and test it against the eight risk zone signals. (Ethical OS, a guidebook published to help Silicon Valley tech companies navigate the ethical implications of their products and services, provides a checklist for doing this.) Think of ways to mitigate the risks, and discuss feasible actions with friends.
• Give a damn. Show care and interest in the outcomes you foresee. Be keenly aware of how people are manipulated by technology, and share that awareness with friends and the people around you.
As parting words, think of your loved ones. What kind of future awaits them in this present world of technology? Isn’t it better to have happy, healthy consumers and communities? Isn’t it best to have humane technology?