Demystifying Algorithms: Understanding the Math that Defines Your Life

By Lauren Cheal

Keeping in mind that we are publishers, and not math people, what is an algorithm?

At first glance, the algorithm sounds like a concept out of a particularly frightening chapter of a calculus textbook, but there is no reason to fear the concept. In Kevin Slavin’s TED Talk, “How Algorithms Shape Our World”, he defines algorithms as “basically, the math that computers use to decide stuff”. This simple definition is an easy way to think about the algorithm, but what “stuff” are computers using to make the decisions?

Wikipedia summarizes algorithms as the following:

An algorithm is an effective method expressed as a finite list of well-defined instructions for calculating a function. Starting from an initial state and initial input (perhaps empty), the instructions describe a computation that, when executed, will proceed through a finite number of well-defined successive states, eventually producing “output” and terminating at a final ending state.

To make this a little easier to understand, think of an algorithm as a program that is capable of going through a huge pile of information and making sense of it. The logical output that comes from this process is defined by the user at the beginning, according to the things they need it to do, and the order and way in which it is asked to do those things.

Jeff Hunter (2011) provides this helpful list of what commonly used algorithms do:

  • Searching for a particular data item (or record).
  • Sorting the data. There are many ways to sort data. (Simple sorting, Advanced sorting)
  • Iterating through all the items in a data structure. (Visiting each item in turn so as to display it or perform some other action on these items)
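These three operations can be sketched in a few lines of Python. This is a toy illustration with made-up book data, not code from Hunter’s article:

```python
# A small data set: book records with titles and weights (in kg).
books = [
    {"title": "B", "weight": 1.2},
    {"title": "A", "weight": 0.8},
    {"title": "C", "weight": 2.5},
]

# Searching: find the record for a particular title.
def find_book(items, title):
    for item in items:          # visit each record in turn
        if item["title"] == title:
            return item
    return None                 # not found

# Sorting: order the records by weight, lightest first.
by_weight = sorted(books, key=lambda b: b["weight"])

# Iterating: visit each item so as to display it.
for book in by_weight:
    print(book["title"], book["weight"])
```

Real data sets are of course far larger, but the shape of the work (search, sort, iterate) is the same.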

Once you grasp the basic idea that an algorithm is a function that categorizes large amounts of information, it is easy to see where algorithms have value in the digital world. Computer information is no more or less than a big pile of data, and algorithms give us shortcuts for processing these enormous data sets.

A basic type of algorithm is a sorting algorithm, which can take a set of data (let’s say, 100 books of different sizes) and sort it in a specific way (according to weight, for example). The Computer Science Unplugged website for children gives a good example of how a couple of different algorithms could tackle a weight-sorting task, each with its own trade-offs in time and space. There are many other types of algorithms in daily use, such as the search algorithms that display Google results, and each in its own way displays results or classifications based on how the algorithm works with the data.
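To make the book-sorting idea concrete, here is a minimal selection sort in Python, loosely modelled on the kind of balance-scale exercise Computer Science Unplugged uses (the code itself is my own sketch, not theirs): repeatedly find the lightest remaining book and move it to the front.

```python
def selection_sort(weights):
    """Sort a list of book weights, lightest first.

    Like the balance-scale exercise: scan the unsorted books,
    pick out the lightest one, and move it to the front.
    """
    weights = list(weights)               # work on a copy
    for i in range(len(weights)):
        lightest = i
        for j in range(i + 1, len(weights)):
            if weights[j] < weights[lightest]:
                lightest = j              # remember the lightest so far
        # Swap the lightest remaining book into position i.
        weights[i], weights[lightest] = weights[lightest], weights[i]
    return weights
```

Selection sort is one of the simplest sorting algorithms; the “advanced” sorts Hunter mentions (such as quicksort or mergesort) reach the same result in far fewer comparisons on large data sets.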
Ok, so besides basic sorting tasks, what can algorithms really do?

Searching, Prioritizing, and Providing Biased Content:

Very complex algorithms can accomplish an endlessly diverse range of tasks. The Google search function is one example we are all familiar with, and now that you know what a basic algorithm does, it makes sense: Google search sorts through an enormous amount of information and uses various functions to come up with the “best” end product, the exact website you were looking for. Seomoz has an entire section devoted to understanding the Google search algorithm, including an interesting timeline that tracks changes to that algorithm from 2000 up to the present. While some of the language on that site is technically advanced (and intimidating), it is interesting to see the huge number of changes every year (Seomoz estimates that Google changes its algorithm 500-600 times a year), and to guess at what those changes mean for how we receive content.

Another implication of algorithm technology, related to Google searching, is found on Facebook. There, algorithms determine what content appears in a News Feed based on what content the user has previously “liked”. Eli Pariser talks about the danger of this kind of filtering in another TED Talk, also embedded below. Pariser argues that people have to be careful about letting algorithms decide what news they see based on their likes, because a healthy news diet consists of both the things they instinctively like (chosen with the gut) and the things that could enrich their understanding of the world by pushing them to discover things outside their current sphere of knowledge. Pariser goes further to say that algorithms are taking the place of traditional news editors (who were human, of course). Where the human editor acts as a gatekeeper and guide to information based on what they know about the audience and what they think the audience needs to know, the algorithm (as of his talk) makes judgments based only on the most superficial, instinctual, and hedonistic of our online habits.
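Facebook’s actual ranking formula is proprietary, but the kind of filtering Pariser describes can be caricatured in a few lines of Python (everything here, the function name, the sources, and the like counts included, is hypothetical):

```python
# Hypothetical sketch: rank stories by how often the user has
# "liked" each source before. Real News Feed ranking is far
# more complex and not publicly documented.
def rank_feed(stories, likes_by_source):
    return sorted(
        stories,
        key=lambda s: likes_by_source.get(s["source"], 0),
        reverse=True,   # most-liked sources float to the top
    )

stories = [
    {"source": "world_news", "headline": "Election results abroad"},
    {"source": "cat_videos", "headline": "Kitten surprises owner"},
]
likes = {"cat_videos": 42, "world_news": 1}

# The well-liked source wins and the harder news sinks,
# which is exactly Pariser's worry.
feed = rank_feed(stories, likes)
```

Even this toy version shows the feedback loop: what you clicked yesterday decides what you are shown today, with no editor weighing what you need to know.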

Assessing physical space, stealing jobs from humans:

Algorithms are also used by robotic machines to measure and assess physical space. The Roomba vacuum cleaner is able to clean a room because of an algorithm that works out the room’s dimensions and then sends the machine to each part of the room, systematically. The 60 Minutes feature embedded at the end of this paper gives another example of this kind of algorithm: robots are programmed to pick up warehouse shelves and bring them to workers at the moment the workers need to access the materials and pack them. These two examples show that algorithms are capable of computing physical space and then assessing a complex set of data to make a certain “choice” about a desired outcome.

Algorithms in conflict: crashes, glitches. What happens when this stuff breaks down?

The most frightening thing about algorithms concerns their volatility and their ability to “speak” to one another, informing the decisions that other algorithms make. In “How Algorithms Shape Our World”, Kevin Slavin talks about the potential harm that can be done when these algorithms work outside of human control, referencing the Crash of 2:45, or the “Flash Crash”, that happened on the U.S. stock market on May 6, 2010. The “black box trading” that algorithms execute, in conjunction with high-frequency trading, contributed to the second-largest point swing, and the largest point decline, in the history of the Dow Jones Industrial Average (Lauricella and McKay, 2010). On how and why this happened, Slavin (2011) explains that in these algorithms, “We’re writing things that we can no longer read. We’ve rendered something illegible”. The term “black box trading” highlights the fact that some of this code works behind a wall of understanding that even the people who wrote the original formula no longer have access to.

When it comes to algorithms that build on each other and change outside of human oversight, the cause for concern is larger still. Many art forms grapple with this potential problem: a lot of science fiction literature and pop culture deals with our fear of creating things that outpace humanity and then turn on it. Think of Isaac Asimov’s I, Robot, Cory Doctorow’s Down and Out in the Magic Kingdom, The Matrix, and Battlestar Galactica. The fact that these messages pervade popular culture (and have done so for hundreds of years) speaks to an understanding that humans might just be too smart for their own good. And while algorithms represent a truly fascinating and powerful tool, extreme caution is wise when implementing automatic systems, particularly when those systems control finances, social lives, access to information, or any other high-stakes domain.

A basic understanding of what an algorithm does, of what it has the potential to do, and of who is controlling the technologies that rely on these equations to make decisions about our lives gives us the power to ask critical questions and make sure that humans are in control of the technologies created.

At least, that is the hope.

There are a number of fun web videos on the topic of Algorithms. Please enjoy those below, and feel free to share others you know about in the comments section here.

The TED Talk that started it all: How Algorithms Shape our World by Kevin Slavin.

60 Minutes Feature: “Are Robots Hurting Job Growth?”. Alarmist title aside, this is a video that shows some very cool uses of algorithms in the manufacturing and production sectors in the United States.

Eli Pariser’s TED Talk: Beware Online “Filter Bubbles”

Another TED Talk, this time about the algorithmic editing of the web and how that editing function affects the content we see online, and thus the reality of the internet we experience. Because these algorithms are now our editors, Pariser argues that we need to make them serve up a balanced news diet, including some junk food and some vegetables.

At PopTech 2012 Jer Thorp gave a presentation on Big Data. This is a visually gorgeous look at different types of data being displayed in very interesting ways.

Thorp looks at how the data trails we leave behind (think of yourself as a data slug that leaves an electronic trace of everything you do) can be examined, visualized, and ultimately understood. He also breaks down the “architecture of discussion” by mapping Twitter conversations that happen around a New York Times article.

He also warns that data is the new oil, and that the fragmented microorganisms that compose oil are not dissimilar to the fragmented pieces of our souls that make up public data.

And finally, from The Onion: Are We Giving The Robots That Run Our Society Too Much Power? This is just one of my favourite robot-related videos of all time. My apologies, it doesn’t have embedding code, but it is worth clicking the link.

Sources Consulted

Danielsson, Lorenzo, E. “Understanding Algorithms for Novice Programmers,” Keeping it Small and Simple. May 20, 2007. Retrieved January 13, 2013.

“Google Algorithm Change History”. Retrieved January 17, 2013.

Green, Scott A., Mark Billinghurst, XiaoQi Chen, and G. J. Chase. “Human-robot collaboration: A literature review and augmented reality approach in design.” (2008).

Hunter, Jeff. “Introduction to Data Structures and Algorithms.” December 28, 2011. Retrieved January 13, 2013.

Lauricella, Tom, and McKay, Peter A. “Dow Takes a Harrowing 1,010.14-Point Trip,” Online Wall Street Journal, May 7, 2010. Retrieved January 15, 2013.

Pariser, Eli. The Filter Bubble: What the Internet Is Hiding from You, Penguin Press (New York). (2011).

Shirky, Clay. A Speculative Post on the Idea of Algorithmic Authority. November 15, 2009. Retrieved January 16, 2013.

Tan, Pan-Ning, Steinbach, Michael, and Kumar, Vipin. Introduction to Data Mining, Addison-Wesley (2006).


  1. Thanks, Lauren, for compiling all the reasons I’m mildly terrified of the internet into one concise paper.

    But, to be serious, the attitude that personalized content derived from algorithms is this beacon of light that will target consumers and encourage (read: brainwash) them into buying, or subscribing, or following, or whatever and thus make producers rich and buyers happy has alarming consequences. For publishers who are using information from e-readers to make decisions on the types of books to publish, there is a very real risk that we’ll lose the diversity that makes literature so important to society. Eli Pariser’s point that we need a variety of news in order to get a balanced diet and make independent decisions about that content couldn’t be more true. Translated to literature, his point is that readers should be exposed to a variety of books, not just the same genre or authors they already know they like. If algorithms are taken to the extreme, publishers will finally have the perfect excuse to stop publishing challenging, subversive, or difficult content. Push, by Sapphire, would never have seen the light of day if algorithms had anything to say about it, and Fifty Shades of Grey would be on permanent reprint. I will concede that algorithms can be helpful in understanding buyer behavior and tailoring systems to be more efficient and helpful. But being helpful should stop short of force-feeding content based on past user activities (which may or may not be indicative of what a user likes or does not like).

    The idea of algorithms running amok in the stock market (among other places) only makes me more convinced that contrary to all good marketing sense, I should continue to limit my online presence and be incredibly wary of sites’ tracking activities. J.K. Rowling said it best: “Never trust anything that can think for itself if you can’t see where it keeps its brain.”

    On the upside, from your paper and the embedded videos, I can safely assume that I’m doing an impeccable job perplexing Facebook for there is rarely, if ever, anything of interest in my newsfeed. I attribute that to my calculated plan to never “like” pages if I can help it, and only write private messages. That, or I have extremely boring friends who do not like any of the things that I am interested in.

    Despite algorithms giving me the heebie-jeebies, I appreciate your thorough and easy-to-follow explanation of what algorithms are and how they work.


    1. Seriously good point made by J.K. Rowling and reiterated by you here, KC. I am always drawn to humour about the robots that run our society, and I think that tendency comes from an inherent unease about “writing things we can’t read”. Certainly many people who write are searching for an immortality made possible by the reproduction of their thoughts for an infinite amount of time, but I wonder if the programmers who write code that can communicate with other code and create new entities share a similar desire for immortality, or understand the frightening potential of those projects.

      I see your Potter reference and raise you a Jurassic Park:

      “Your scientists were so preoccupied with whether or not they could, they didn’t stop to think if they should.” – Michael Crichton via Dr. Ian Malcolm

  2. I should also mention that with 500-600 Google algorithm updates a year, the search engine has its own opportunity to run amok in the same way Slavin identifies in the challenges of the Wall St. algorithms. Sometimes we build things that get so complex we no longer know exactly what they’re doing. There have been a few known times when Google has reverted a change due to unexpected consequences in how the search results displayed after an algorithm update.

Comments are Closed.