Algorithms Are Instructions for Problem Solving
We live in a world where computers are only vaguely understood, even though they permeate every moment of our lives. But there's one area of computer science where anyone can understand the basics of what's going on: programming.
Programming isn't glamorous work, but it's the foundation of all computer software, from Microsoft Office to robocallers. And even if your knowledge of programming stems solely from bad '90s movies and off-beat news reports, you probably don't need anyone to explain to you what a programmer does. A programmer writes code for a computer, and the computer follows the instructions in that code to perform tasks or solve problems.
So where do algorithms fit in? In the world of computer science, an algorithm is just a fancy word for code. Any set of instructions that tells a computer how to solve problems is an algorithm, even if the task is super easy. When you turn on your computer, it follows a set of "how to turn on" instructions. That's an algorithm at work. When a NASA computer uses raw radio wave data to render a photograph of outer space, that's also an algorithm at work.
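To make that concrete, here's a deliberately tiny example in Python (the function and the numbers are invented for illustration). By the definition above, these few lines are a genuine algorithm: an exact, step-by-step recipe for averaging a list of numbers.

    # A tiny algorithm: step-by-step instructions for averaging numbers.
    def average(numbers):
        total = 0
        for n in numbers:                # Step 1: add up every number.
            total += n
        return total / len(numbers)      # Step 2: divide the sum by the count.

    print(average([3, 5, 10]))  # Prints 6.0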
The word “algorithm” can be used to describe any set of instructions, even outside the realm of computing. For example, your method for sorting silverware in a drawer is an algorithm, as is your method of washing your hands after using the bathroom.
But here's the thing: These days, the word "algorithm" tends to be reserved for some very specific tech conversations. You don't hear people talking about "basic mathematics" algorithms or "MS Paint graffiti tool" algorithms. Instead, you hear Instagram users complaining about friend suggestion algorithms, or privacy groups bashing Facebook's data collection algorithms.
If “algorithm” is a catchall term for computational instructions, then why do we use it almost exclusively to describe confusing, magical, and evil aspects of the digital world?
Most People Use “Algorithms” and “Machine Learning” Interchangeably
In the past, programmers and pop culture referred to most computational instructions as “code.” This remains true today, for the most part. Machine learning is the big, cloudy area of computing where we tend to use the word “algorithm” instead of “code.” This has, understandably, contributed to the confusion and unease surrounding the word “algorithm.”
Machine learning has been around for a long time, but it's only become a large part of the digital world in the last 15 or so years. While machine learning sounds like a complicated idea, it's pretty easy to understand. Programmers can't write and test specific code for every situation, so they write code that can, in effect, write itself: programs that adjust their own behavior based on the examples they're fed.
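As a bare-bones sketch of what that looks like (the task and the numbers here are made up for illustration), the Python program below is never told the rule connecting its inputs to its outputs. It starts with a guess and keeps nudging that guess until the examples fit:

    # A minimal "learning" loop: the program discovers the rule y = 2x
    # from examples, rather than having a programmer hard-code it.
    examples = [(1, 2), (2, 4), (3, 6), (4, 8)]  # (input, correct output) pairs

    w = 0.0                              # The program's initial guess at the rule.
    for _ in range(100):                 # Repeatedly refine the guess.
        for x, y in examples:
            error = (w * x) - y          # How wrong is the current guess?
            w -= 0.01 * error * x        # Nudge the guess toward the answer.

    print(round(w, 3))  # Prints 2.0: the program "learned" the rule on its own.

Real machine learning systems juggle millions of these adjustable numbers instead of one, but the core idea is the same.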
Of course, machine learning isn't all fine and dandy. The name "machine learning" sounds creepy enough to make some people uncomfortable, and some of the popular uses for machine learning are ethically questionable. The algorithms that Facebook uses to data-mine users across the web are an unflattering example of machine learning.
In the press, you’ll hear about “Google’s algorithm” for ranking search results, “YouTube’s algorithm” for recommending videos, and “Facebook’s algorithm” for deciding which posts you see in your timeline. These are all subjects of contention and debate.
Why Algorithms Are Controversial
Long division is one familiar algorithm for dividing numbers (there are many others); it just happens to be carried out by schoolchildren instead of computers. Your Intel CPU uses an entirely different algorithm when it divides numbers, but the results are the same.
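For instance, here's grade-school long division as a short Python sketch (the function name is ours, not anything standard). It produces one quotient digit at a time, exactly the way a student works the problem on paper:

    # Grade-school long division: bring down one digit at a time.
    def long_divide(dividend, divisor):
        quotient = ""
        remainder = 0
        for digit in str(dividend):
            remainder = remainder * 10 + int(digit)  # Bring down the next digit.
            quotient += str(remainder // divisor)    # How many times does it fit?
            remainder = remainder % divisor          # Carry the rest forward.
        return int(quotient), remainder

    print(long_divide(1234, 7))  # Prints (176, 2), since 7 * 176 + 2 = 1234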
Speech-to-text generally uses machine learning, but no one talks about the speech-to-text “algorithm” because there is an objectively correct answer every human can instantly recognize. No one cares about “how” the computer figures out what you said or whether it’s machine learning or not. We just care whether the machine got the right answer.
But other applications of machine learning don’t have the benefit of having a “right” answer. That’s why algorithms have become a regular subject of conversation in the media.
An algorithm for sorting a list alphabetically is just a way of accomplishing a defined task. But an algorithm like Google’s for somehow “ranking the best websites for a search” or YouTube’s for “recommending the best video” is much vaguer and doesn’t accomplish a defined task. People can debate whether that algorithm is producing the results it should, and people will have different opinions on that. But, with our alphabetical sorting example, everyone can agree that the list ends up sorted alphabetically as it should. There’s no controversy.
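To see the difference side by side, here's the uncontroversial case as a two-line Python illustration (the list of names is made up):

    # Sorting alphabetically: a task with one objectively correct answer.
    names = ["Carol", "Alice", "Bob"]
    print(sorted(names))  # Prints ['Alice', 'Bob', 'Carol'] -- nothing to debate.

There's no equivalent snippet for "the best video to recommend," because "best" isn't something a list can simply be checked against.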
How Should We Use the Word "Algorithm"?
Algorithms are the basis of all software. Without algorithms, you wouldn’t have a phone or computer, and you’d probably be reading this article on a piece of paper (actually, you wouldn’t be reading it at all).
But the general public doesn't use the word "algorithm" as a catchall term for computer code. In fact, most people assume there's a difference between computer code and an algorithm, but there isn't. Because "algorithm" is so strongly associated with machine learning, its meaning has become foggy, even as its usage has grown more specific.
Should you start using the word "algorithm" to describe even the most trivial pieces of computer code? Probably not, as not everyone will understand what you mean. Language is always changing, and usually for a reason. People need a word to describe the confusing, opaque, and sometimes dubious world of machine learning, and "algorithm" is becoming that word, at least for now.
That said, it's good to keep in mind that an algorithm (and machine learning) is, at its core, just a bunch of code written to perform tasks and solve problems. There's no magic trick; it's just a more complicated iteration of the software we're already familiar with.
Sources: Slate, Wikipedia, GeeksforGeeks