The Big-O notation is the way programmers think about how fast their programs run, or how much memory they need. C’mon, really? Do we need *Big-O* to measure time? If the difference between one program and another is that one reacts to a button press in 0.1 seconds and the other in 0.2 seconds, then talking about seconds is the proper language. But what if one program runs in *some* seconds, and the other in *some* thousands of years? Then it’s not important how much *"some"* is, but rather that we are talking about different orders of magnitude. This is where the Big-O notation comes in: it is the proper language for talking about orders of magnitude.
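To make the orders-of-magnitude point concrete, here is a back-of-the-envelope sketch. The input size, the operation counts, and the rate of one billion simple operations per second are illustrative assumptions, not measurements:

```python
# Roughly compare a linear-time and an exponential-time algorithm on
# the same input size, assuming 10^9 simple operations per second.
OPS_PER_SECOND = 10**9
SECONDS_PER_YEAR = 60 * 60 * 24 * 365

n = 70
linear_ops = n           # O(N): e.g. one pass over the input
exponential_ops = 2**n   # O(2^N): e.g. trying every subset of the input

print(linear_ops / OPS_PER_SECOND)        # a tiny fraction of a second
print(exponential_ops / OPS_PER_SECOND
      / SECONDS_PER_YEAR)                 # tens of thousands of years
```

The exact constants don’t matter: even if the computer were a thousand times faster, the exponential algorithm would still need decades, while the linear one would still finish instantly.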

When your program runs on small-scale data, how fast it runs matters less, because every run takes a fraction of a second. The problem arises when the data set grows: the program will run slower, but how much slower?

### Defining the Big-O notation

Have you ever used an app that initially was awesome, but as you used it and entered more data, it started lagging and working slower until it was barely usable at all? If so, you have experienced an app whose Big-O running time grows too fast.

In short, saying that time, memory, money, or anything else is *O(X)* means that it’s roughly proportional to *X*; we just don’t care about the exact ratio. Let me share a relevant story …

### Codility birthday cakes

Most of my experience as a developer comes from working at Codility. I joined the company when there were eight of us, and now there are over a hundred employees. We had a tradition that every time someone brought in a cake for a birthday, it was shared with all of us. Initially, even a small cake was enough (even to have seconds). As we grew in size, people started to bring in more cake, and each person was limited to one small piece.

It became clear that this tradition was not sustainable. Why not? Couldn’t we just agree on smaller pieces, or on buying more cakes? No, because sooner or later there would be more cake than we could possibly eat, even if we ate nothing else at all!

Let’s say there are *N* people at Codility. The tradition we want to sustain is that once a year, the person who has a birthday provides a piece of cake to each employee. A piece of cake weighs at least 80 grams. For simplicity, let’s skip weekends and assume that there are 200 working days a year; if someone has a birthday during the weekend, they celebrate it on the previous or following working day.

This table shows how the average amount of birthday cake eaten by one person grows with the number of people in Codility:
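The growth in the table follows directly from the assumptions above (at least 80 g per piece, 200 working days a year, *N* birthdays a year), and can be recomputed with a minimal sketch:

```python
# Average grams of birthday cake eaten per person per day, as a
# function of company size N. Each of the N birthdays per year gives
# every employee a piece of at least 80 grams, and the total is
# spread over 200 working days.
GRAMS_PER_PIECE = 80
WORKING_DAYS_PER_YEAR = 200

def daily_cake_grams(n_people):
    yearly_grams = n_people * GRAMS_PER_PIECE  # one piece per birthday
    return yearly_grams / WORKING_DAYS_PER_YEAR

for n in [8, 50, 100, 500, 1000]:
    print(f"N = {n:5d}: {daily_cake_grams(n):7.1f} g of cake per day")
```

At 100 employees that is already 40 g of cake per person every working day; at 1,000 it would be 400 g, roughly a whole cake each, daily.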

And as to the Big-O, that’s basically all there is to it. It’s just saying that something is proportional to some expression depending on *N*, but the exact ratio is not relevant.

You’re probably wondering: why wouldn’t we just have smaller pieces of cake? That would be a temporary fix, but it wouldn’t change the fact that this birthday cake tradition is not sustainable as the company grows. The amount of cake you get every day (on average) is **proportional** to *N*: it’s *O(N)*. We don’t need to calculate exact numbers to know that it cannot be healthy.

Eager to put your knowledge to the test? Visit our Programmer's Home and take this lesson about Big-O notation and time complexity.

*This blog originally appeared on Cubical Thoughts.*