
Classic Math Vs Artificial Intelligence

The transition from a “function-centric world” to a “data-centric world”


By guest author: Donata Petrelli, Ph.D. (Italy), Ms. AI Expert


From the earliest shepherds, through the Egyptians, Sumerians and Babylonians, then the Indians and the Greeks, and on to the first European mathematicians of the 18th and 19th centuries, humanity's primary objective has always been "to measure the world".


Having certainties and points of reference allows humanity to continue its development: values that can be confirmed or refuted only after decades of trials and tests of validity.


The important thing has always been to find the logic behind the various phenomena of reality, a logic that allows us to extract information and answers.


However, there is a problem. While we observe, hypothesize, test, confirm or refute… the world evolves and changes in the meantime. New discoveries and innovations broaden our world, deepen it, and raise new questions about the answers we have just found.


Photo credits: Franki Chamaki on Unsplash

All this is the cause and, at the same time, the effect of scientific, technological and social development.


The basic problem is always to find the right tools to interpret it!


This change leads to a new interpretative paradigm, which in turn brings new methods of "measurement".


Over the decades we have moved from a classical deductive approach to a modern inductive one, and with it come new instruments and scientific methods.


So from a "demonstrative" approach, in which the data adapts to the function in order to demonstrate the thesis, we are now in a completely reversed phase, in which there is too much data and we must find ways to interpret it.


Classical mathematics fully answered the first phase, where the mathematical function was the solution and the data was explained by it. The concepts of domain and codomain perfectly reflect this function-centric world view: data that cannot be interpreted by the function is discarded because it lies outside the domain, and only the solutions that fall in the codomain are considered.


For a time we were satisfied: we analyzed the phenomenon and then found a function that represented it as faithfully as possible. In this scenario we selected the data sets suitable for that function.
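

Here is a minimal sketch of this function-centric view, in Python, using a toy function and made-up measurements chosen purely for illustration: the function is fixed in advance, and data that falls outside its domain is simply thrown away.

```python
import math

# The "law" is chosen in advance: f(x) = sqrt(x), with domain x >= 0.
def f(x):
    return math.sqrt(x)

measurements = [4.0, 9.0, -1.0, 16.0, -7.0]  # made-up observations

# Function-centric view: keep only the data that fits the domain of f...
usable = [x for x in measurements if x >= 0]

# ...and the answers are the values of f, which live in its codomain.
answers = [f(x) for x in usable]

print(usable)   # [4.0, 9.0, 16.0] -> the out-of-domain data was discarded
print(answers)  # [2.0, 3.0, 4.0]
```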


Today there is Big Data: too much data, too many different patterns... which function should we use? Now the data is at the center of the universe and scientific analysis is its servant. We do not make preliminary hypotheses; we let Big Data speak.


Artificial Intelligence algorithms respond to this new data-centric world view. Doing research with Artificial Intelligence mainly means finding suitable algorithms that lead to the optimal response. It is now the algorithm that adapts to the data, and there is no longer a domain of existence or a codomain. The more data you can process, the more credible the solution will be.
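

As a minimal sketch of this data-centric view, again with made-up points chosen purely for illustration, a simple nearest-neighbour rule can stand in for the many data-driven algorithms alluded to here: no function is chosen in advance, the prediction comes straight from the observed data, and any query gets an answer because there is no domain restriction.

```python
# A minimal data-centric sketch: a 1-nearest-neighbour predictor.
# No function is fixed in advance; the "model" is just the data itself.
# The (x, y) observations below are made up purely for illustration.

data = [(0.5, 1.1), (1.0, 2.1), (2.0, 3.9), (3.0, 9.2), (4.0, 15.8)]

def predict(x, observations):
    """Answer with the y of the closest observed x: the data speaks directly."""
    closest = min(observations, key=lambda pair: abs(pair[0] - x))
    return closest[1]

print(predict(2.2, data))   # 3.9 -> the answer comes from the data, not from a formula
print(predict(-1.0, data))  # 1.1 -> no domain restriction: every query gets an answer
```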


Throughout this extraordinary evolutionary process there are always mathematical models, abstractions of reality that allow us to find the logic underlying it.


In classical mathematics, the logic of models was cause and effect: finding a representative model of reality meant finding the function that precisely related causes to effects.


Now the model follows the logic of the data: it adapts to it, modifies itself through it and improves as the data grows. We are in the age of intelligent machines that are not programmed to solve but that learn to solve, from experience and from errors.
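

A minimal sketch of "learning from errors", assuming a toy one-parameter linear model and made-up data chosen purely for illustration; real systems are far richer, but the loop is the same idea: predict, measure the error, correct.

```python
# A toy "learn from errors" loop: gradient descent on a one-parameter linear model.
# The data and the learning rate below are made up purely for illustration.

data = [(1.0, 2.0), (2.0, 4.1), (3.0, 5.9), (4.0, 8.2)]  # roughly y = 2x

w = 0.0             # the model starts out knowing nothing
learning_rate = 0.01

for step in range(1000):
    for x, y in data:
        error = w * x - y               # how wrong the model is on this example
        w -= learning_rate * error * x  # correct the model in proportion to its error

print(round(w, 2))  # close to 2 -> the model was not programmed with the rule; it learned it
```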


The only certainty today is that everything evolves and nothing stays as it was yesterday. We are only at the dawn of the age of Artificial Intelligence... the "Euclidean Artificial Intelligence" :)
