Data are not objective, and algorithms are not neutral

To claim that we live in an increasingly digital age is rather uncontroversial. Areas that until recently were considered hallmarks of the physical and the analogue, such as various aspects of human emotions and social behaviour[AU1], are rapidly becoming ‘datafied’ and brought into the digital realm.

Utilising the ongoing process of datafication, digital platforms have become the infrastructure upon which our lives are lived. In these virtual spaces, commerce is facilitated, transportation is organised, romantic relationships are formed, and attempts are made to address the pressing climate crisis, among other notable examples. Against this backdrop, it is reasonable to expect that digitalisation will increase rather than decrease.

This development puts pressure on the role of leaders as decision-makers. Human judgment is often pitted against rational analyses produced by neutral algorithms that crunch objective data. But this ostensible division is artificial, because data are not objective and algorithms are not neutral.

Data are not objective

In daily speech, data are often equated with objectivity; the two are seen as sides of the same coin. If this were true, data would exist outside any context and could provide a universally true foundation for making decisions. The problem is that data are not objective. They are always produced about and within a context. This may sound abstract, but it is really quite intuitive. Let me use an example to illustrate. To make decisions in an enterprise, data are needed about the enterprise. And which data are relevant about the enterprise is determined by an array of factors.

Enterprises are, for example, legally obliged to produce and publish financial accounts to provide internal and (predominantly) external stakeholders with information about their financial state. Production of other types of data, such as customer data, is not legally required but may still be desirable if the decision-makers in the enterprise find them relevant and in line with the enterprise's goals and objectives.

But how do decision-makers get hold of relevant data about the enterprise? By constructing systems and organisational units within the enterprise that will produce them. Financial accounting data are produced by the accounting department, production data by the operational units, customer data by the marketing department, and so on. The point is this: data about a certain context only exist if someone has decided that they are relevant, and if systems have been established within that context to produce them.

The above example is fictitious but illustrates a general point. It may, for example, be extended to data about national economies or the climate[AU2], which are also produced about and within a context.

Since data are always inextricably linked to context, they cannot be objective.

Algorithms are not neutral

Just as data are equated with objectivity, algorithms are frequently equated with neutrality. They are assumed to be indifferent machines that impartially process data. The problem is that algorithms are not neutral. This is primarily due either to bias in the data sets that an algorithm processes or to the properties of the algorithm itself.

Because data are produced about and within a context, they can be biased. Such biases may be introduced deliberately or accidentally, but an algorithm may nevertheless reinforce them. The problem of algorithmically reinforced, data-based bias, which may, for example, result in discrimination against vulnerable groups, is well documented[AU3].
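As a stylised illustration, consider how a simple rule derived from skewed historical data re-encodes the skew. The sketch below uses Python with invented numbers and a hypothetical hiring scenario; it is not drawn from any of the cited studies.

```python
# Minimal sketch with made-up numbers: a rule derived from historically
# skewed hiring decisions reproduces, and thereby reinforces, that skew.

historical_hires = [  # (group, hired) pairs from a biased past process
    ("A", True), ("A", True), ("A", True), ("A", False),
    ("B", True), ("B", False), ("B", False), ("B", False),
]

def hire_rate(group: str) -> float:
    """Share of past applicants from a group who were hired."""
    outcomes = [hired for g, hired in historical_hires if g == group]
    return sum(outcomes) / len(outcomes)

def shortlist(group: str) -> bool:
    """A naive 'learned' rule: favour groups with a high past hire rate."""
    return hire_rate(group) > 0.5

print(hire_rate("A"), shortlist("A"))  # 0.75 True
print(hire_rate("B"), shortlist("B"))  # 0.25 False
```

The bias originates in the data, not in the arithmetic; the rule merely automates, and thus entrenches, the pattern it was handed.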

But bias may also stem from the algorithm itself[AU4]. Algorithms are, for instance, designed by humans, which means that subjective choices made in the development phase influence how data are processed. One example is a trading algorithm in which the specific risk appetite of the investor is programmed into the model, as sketched below.
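The code below is a hypothetical sketch (the threshold, names and numbers are invented, not taken from any real trading system); it shows how a subjective design choice becomes part of the model's behaviour.

```python
# Minimal sketch: a trading rule whose behaviour hinges on a subjective
# design choice. RISK_APPETITE is not derived from the data; it encodes
# the investor's preferences, hard-coded by the developer.

RISK_APPETITE = 0.02  # maximum acceptable daily volatility (a human choice)

def should_buy(expected_return: float, volatility: float) -> bool:
    """Buy only if the expected return is positive and the estimated
    volatility stays within the hard-coded risk appetite."""
    return expected_return > 0 and volatility <= RISK_APPETITE

# Two assets with the same expected return are treated differently
# purely because of the threshold the developer chose.
print(should_buy(expected_return=0.01, volatility=0.015))  # True
print(should_buy(expected_return=0.01, volatility=0.025))  # False
```

A different investor, with a different appetite for risk, would set the threshold differently, and the ‘same’ algorithm would then make different decisions on identical data.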

Different forms of statistical bias may also undermine algorithmic neutrality. This is particularly noticeable in recommender algorithms on digital platforms such as Amazon, Spotify and Netflix. Such algorithms have been shown to recommend already-popular items disproportionately often (‘popularity bias’)[AU5], and hence do not give a neutral presentation of the available choices.
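The mechanism behind popularity bias is easy to illustrate. Below is a deliberately naive, hypothetical sketch (the items and counts are invented, and real recommender systems are far more sophisticated): ranking purely by past interaction counts over-exposes whatever is already popular.

```python
# Minimal sketch of popularity bias: a recommender that ranks items purely
# by past interaction counts keeps surfacing already-popular items.

from collections import Counter

# Hypothetical interaction log aggregated across all users.
interactions = [
    "item_a", "item_a", "item_a", "item_a",
    "item_b", "item_b",
    "item_c",
]

def recommend(top_n: int = 2) -> list[str]:
    """Return the top_n most frequently consumed items."""
    counts = Counter(interactions)
    return [item for item, _ in counts.most_common(top_n)]

print(recommend())  # ['item_a', 'item_b']
# item_c is never recommended, so future interactions will skew even
# further towards the items that were already popular: a feedback loop.
```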

Implications

The implications of data not being objective and algorithms not being neutral may seem obvious: we cannot blindly trust either of them. The global financial crisis of 2008, triggered in part by analysts blindly trusting their risk models[AU6], is a case in point.

The non-objectivity of data and the non-neutrality of algorithms also mean that the ostensible division between unreliable human judgment and rational machines is artificial and should be discarded as a premise for discussions about whether humans or machines should dominate decision-making processes.

The premise rather needs to be that humans and machines have complementary strengths and weaknesses that can feed into each other in a decision-making process. And leaders who foundationally understand data and algorithms, including their lack of objectivity and neutrality, will have an advantage in an increasingly digitalised world.

[AU1] https://www.wired.com/story/artificial-emotional-intelligence/

[AU2] https://data.worldbank.org/

[AU3] https://hbr.org/2013/04/the-hidden-biases-in-big-data

[AU4] https://hbr.org/2016/12/hiring-algorithms-are-not-neutral

[AU5] https://link.springer.com/article/10.1007/s11257-024-09406-0

[AU6] https://www.linkedin.com/pulse/when-all-models-fail-lessons-from-2008-financial-crisis-cecconi-puhyf/

Andreas Ulfsten

He is a Postdoctoral Fellow at the Department of Informatics at the University of Oslo and holds a PhD from the Norwegian School of Economics and Business Administration in Bergen. With a particular interest in digital technology and organisations, he focuses on how public and private organisations develop and operationalise data strategies for management.
