EUT logo
On the Exactitude of Big Data: La Bêtise and Artificial Intelligence

This article revisits the question of ‘la bêtise’, or stupidity, in the era of Artificial Intelligence driven by Big Data, extending the questions posed by Gilles Deleuze and, more recently, by Bernard Stiegler. The framework for revisiting la bêtise, however, is the lens of contemporary computer science, in particular the development of data science as a mode of analysis that is sometimes misinterpreted as a mode of intelligence. The article argues that with the advent of hype (sometimes referred to as the hype cycle) around big data and modalities of data analytics, a form of computational or functional stupidity is at work. The exaggerated promise that big data will solve everything creates overblown expectations that will ultimately lead to disillusionment with data science; this can be seen in a number of domains, for example smart city technologies, the Internet of Things, and machine translation. Beyond the negative effects of these exaggerated claims, there is the possibility that societal norms will accommodate Big Data technological change by incorporating the bêtise of Big Data, thereby altering our relationship to technology; examples include privacy standards and the ownership of data. The article concludes by setting out some of the limitations of Artificial Intelligence and Big Data in order to allow a re-examination of the claims made for them.

Authors

Noel Fitzpatrick, John D. Kelleher

Download

On-the-Exactitude-of-Big-Data_-La-Betise-and-Artificial-Intellige-1.pdf
