You can’t bureaucratize corporate morality with IT automation

If it can be argued that corporations have “a mind of their own,” can they act morally and be held morally responsible for the actions of their employees? And can corporations demand that employees follow a strict code of conduct to ensure moral behaviour? It can be hard to know what role moral considerations actually play in a corporation’s or an employee’s choices, because behaviour can be a function of countless known and unknown factors. It is possible that when some make decisions, moral considerations are completely idle.

Some may argue that what is moral or not reflects the individual perspective of a person or corporation. The legality or illegality of drugs in the United States is a perfect example of moral confusion: “moral” motivations exist on both sides. So morality, it seems, can be flexible.

The question is, can corporate accountability be so flexible that it adversely affects the IT operations and well-being of both the corporation and its employees?

The emphasis on stricter accountability now sweeping the corporate world was an understandable response to some sickening bookkeeping scandals. The notion would never have evolved from a buzzword into the focus of voluminous legislation, however, had governments not also been lured by the myth of precision: because accountability suggests there is a right and a wrong answer to every question, it flourishes where authorities can measure results exactly.

This stricter accountability whispers two seductive lies to society: systems (including computer systems) go wrong because of individuals; and the right set of controls will enable society to prevent individuals from creating disasters. This kind of accountability is a form of superstitious thinking that lets society live in denial about just how little control we have over our environment and our actions.

This notion of accountability assumes perfection: if anything goes wrong, it is a sign that the system is broken. It is also blind to human nature. It assumes, for example, that corporations that are being watched, or that have IT systems monitoring them, will not do wrong. That seriously underestimates multi-faceted human minds and motivations.

It bureaucratizes and atomizes responsibility. While claiming to increase individual responsibility, it drives out human judgment: when a sign-off is required for every step in a workflow, those closest to a process lose the freedom to optimize or rectify it. Similarly, it assumes that an individual’s laxness caused any given problem: if somebody hadn’t been asleep at the switch, gotten greedy, or assumed that somebody else would clean up the mess, none of this would have happened.

Finally, this tighter version of accountability tries to squeeze centuries of thought about how to entice people toward good behaviour, and dissuade them from bad, into simple rules by which individuals can be measured and disciplined. It would react to a car crash by putting a stop sign and a policeman at every corner.

Bureaucratizing morality by layering automation onto a complex organization gives society the sense that it can exert close control. But grown-ups prefer clarity and realism to happy superstition.

Jim Love, Chief Content Officer, IT World Canada
