
Short Essays

From a machine learning novice:

After working with machine learning, deep learning, neural networks, or whatever you want to call it, I've grown - as anyone who knows little about something, but just enough to fake it, would - skeptical and confused. I preface this rant by saying that it only applies to machine learning applications in engineering.

Normally, a good algorithm (or a simulation, in the engineering sense, I guess) is one that encapsulates the science, the logic behind it. For example, a physics simulation code would take into account the various physical effects that are at play in the control volume. However, such simulations are sometimes incredibly costly to create, take a long time to run, and usually come with a learning curve before you can use them or understand their output.

Data-driven regression / 'surrogate' models (a.k.a. "machine learning algorithms"), however, skip the step of doing all the physics and directly relate the perceived realities (data). They basically recognize and 'learn' the pattern and relationship between the input data and the output data, so they can predict the output given the input. These 'innovations' of machine learning are praised and applied to every realm of analysis, from trying to read people's minds to modeling nuclear reactors (to a degree). However, as anybody who has had experience with machine learning will tell you, these models are usually not reliable or robust enough to replace the logic(physics)-based models, and their output should definitely be taken with a grain of salt. Among the many things that can go wrong: these 'artificial intelligences' being trained on bad data, or an insufficient amount of it.
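To make that concrete, here's a toy sketch of the idea (Python with NumPy and scikit-learn; the expensive_simulation function and every number in it are made-up stand-ins, not any real physics code): run the expensive thing a couple hundred times, fit a regressor to the input/output pairs, and let the regressor answer in its place.

```python
# A toy surrogate: fit a regressor to samples of an "expensive" code,
# then use the cheap regressor in place of the physics.
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(0)

def expensive_simulation(x):
    # Stand-in for a costly physics code: some nonlinear response.
    return np.sin(3 * x) + 0.5 * x**2

# "Data": a limited batch of simulation runs over inputs in [0, 2].
X_train = rng.uniform(0.0, 2.0, size=(200, 1))
y_train = expensive_simulation(X_train).ravel()

surrogate = RandomForestRegressor(n_estimators=100, random_state=0)
surrogate.fit(X_train, y_train)

# Inside the range it was trained on, the pattern-matching looks great.
x_new = np.array([[1.0]])
print("surrogate:", surrogate.predict(x_new))
print("physics:  ", expensive_simulation(x_new).ravel())
```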

On the surface, it may seem like the machine learning model knows what it is doing. It will always give you an answer. But that answer can be incredibly wrong, and we wouldn't have any idea why. Neural networks especially are 'black boxes': an input goes in and an output comes out. There are no lengthy logical calculations or equations to inspect. It's like the kid you're babysitting who, when asked why he is angry, aggressively proclaims 'BE.CAUSE'. You just have to take his word for it, and there's nothing you can do to inquire further.
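The "always gives an answer" part is worth seeing in code. Here's another toy sketch (Python/scikit-learn; the data, the network size, and the out-of-range query are all invented for illustration): a small neural net trained on a sliver of data will happily predict something far outside anything it has seen, with no hint that it's guessing.

```python
# A small "black box": trained on a narrow sliver of data, it still
# answers confidently about inputs it has never seen anything like.
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(1)

# "Insufficient data": 30 samples from a narrow slice of the input space.
X_train = rng.uniform(0.0, 1.0, size=(30, 1))
y_train = np.sin(3 * X_train).ravel()

net = MLPRegressor(hidden_layer_sizes=(32, 32), max_iter=5000, random_state=0)
net.fit(X_train, y_train)

# Ask about a point far outside the training range.
x_far = np.array([[4.0]])
print("net says:  ", net.predict(x_far))         # an answer, always
print("truth says:", np.sin(3 * x_far).ravel())  # usually not that answer
# There's no intermediate step to point at and ask "why?" - just BE.CAUSE.
```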

Such may also be the case with people. Some people study law, politics, history, and science to construct their expert model, one that uses the laws of reasoning and science to produce an output. They also accumulate abundant, diverse data that has withstood the test of time. On the other hand, some people are merely 'trained' on the limited data they have from unvetted sources, and come up with their own output, an opinion, that their 'neural network' produced. When asked how they came up with that output, the answers generally do not stray far from a coincidental encounter with a sliver of information. From a logical standpoint, it would be ludicrous to give both opinions equal weight.

The lack of good, diverse data is a huge problem. The advent of social media, which was supposed to solve this problem, is only making it worse. People are becoming more and more extreme. It's rather comical - a machine learning algorithm at Facebook determines what you 'like', and shows you more things that you 'like'. For example, if you're a liberal, you'll mostly see liberal news and content. Same goes for conservatives, Christians, LGBTQIA folks, incels, whatever. This curation (or should I say the c word... censorship?) of like-minded data reinforces the previous training of OUR neural network even deeper (e.g. "I found a lot of videos on Facebook about why immigrants are ruining America"), and actually makes things worse, since we are only exposed to a single facet of the data pool - of the way things are (or are perceived to be). So one machine learning algorithm is making another machine learning algorithm (our brain) a lot less robust and a lot less useful. Fuck me, right? When someone says that artificial intelligence will end humans, we tend to imagine robots with machine guns or a rogue software shutting down our power grid. But I think this is it - by simply giving us only what we want (to see or believe), and robbing us of the humbling experience of admitting we were wrong, we are becoming less human. We become the satisfied fool from the Socrates quote.
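If you wanted to caricature that feedback loop in code, it would look something like this (pure Python; the topics, the stories, and the "engage with what you're served" rule are all invented for the sake of the sketch, not a description of any real platform's algorithm):

```python
# A caricature of the loop: serve whatever bucket the user engaged with
# most, watch the history fill up with that bucket, repeat.
import random

CONTENT_POOL = {
    "liberal":      ["story L1", "story L2", "story L3"],
    "conservative": ["story C1", "story C2", "story C3"],
}

def recommend(history, pool):
    """Pick content from whichever topic the user has engaged with most."""
    if not history:
        return random.choice([item for items in pool.values() for item in items])
    favorite = max(pool, key=history.count)
    return random.choice(pool[favorite])

# One early "like" and the loop locks in: every later recommendation comes
# from the same bucket, so the other facet of the data pool never shows up.
history = ["liberal"]
for _ in range(5):
    item = recommend(history, CONTENT_POOL)
    history.append("liberal" if item.startswith("story L") else "conservative")
    print(item)
```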

In this world of convenience and democratization of information and opinion, I am afraid to say that we are fucked. In pursuit of convenience, we pursue the final data instead of learning about how it was made, or about the methods used. We have become dataists, believing that data alone can enlighten and save us. Would you believe a Florida Man who claims he is the son of God without being skeptical? Because some of the data you encounter can be as sketchy as that dude wearing a different croc on each foot. What I'm trying to say is that we should encourage skepticism and exploratory attitudes, where people look at a phenomenon or a piece of data and want to see whether anything contradicts it. But we don't, because we just don't have time - there are Netflix shows to watch, and there are White Claws that need consumption. Or we are just too set in our own beliefs, so that when we encounter anything that contradicts them, we dismiss it as Nazi propaganda. It would be a Nazi thing to say that we should discredit and silence the people who do not have the credentials, or who have 'inadequate training' (in both exposure to data and reasoning). I mean, I guess we do it on a personal level, but if we did it systematically, that'd be weird.

It's a fucking whirlwind out there. In this society characterized by postmodernism, everybody's a fucking expert and a critic - even the neural networks trained on four datasets from Twitter. We use memes, quite possibly the most condensed medium of expression, rather than tell stories, addicted to how quickly we can deliver and receive - and, quite promptly thereafter, forget or delete - our feelings and ideas. Everything is ephemeral, and absurd.

But there might be a positive way to spin this. Why would my strong belief in the inevitable doom of humankind, and in absurdism, prevent you from making the best of whatever is left? As any cliched amateur writer would, I'll compare life to the most archetypal journey to hopefully make a point - Odysseus' journey back to Ithaca. Except that there's no Ithaca, and Penelope (your wife) has remarried to become the seventh wife of a Saudi royal. What would you do in this situation - give up, jump into the water, and kill yourself? That would make a terrible story. But even without the happy ending of the Odyssey, it is still a good story - the journey itself is pretty fun. So tie yourself to the mast, and hear the Sirens sing.

 
