The arithmetics of throughput-based games

Back in the day when we were making games, I loved trying to quantify a game’s economy more rigorously, and how its core mechanics evolve numerically. Around 2010, a very popular game that existed in many variations was a type of game in which you had to manage a pipeline, for example by building a cafe (or an airport, or a hospital, or a zoo… the core game mechanics were similar across all these themes) and serving patrons. In, say, the cafe version of this game, patrons would show up at a certain rate, would wait for a certain period of time for your cafe to have a free table to serve them, would spend some time sitting at the table, and would leave, and either increase or decrease your cafe’s “buzz”, depending solely on whether your cafe was able to serve them in time. An increased “buzz” (from serving more patrons) increased the rate at which they show up (and a decreased “buzz” would reduce the rate), so the game would reach interesting equilibria at which your cafe’s capacity to serve patrons would equal the “buzz”-driven demand determining the rate at which new patrons show up. As a player, your levers to serve more patrons consisted of hiring more waiters, hiring more chefs, placing more tables, etc. Many versions of this mechanic are still around in today’s mobile games. Our game was called Cafe Mania (and I think something like 5-10% of all Brazilian internet users played it at some point). It’s so simple and elegant that I recently remembered it and dug out my old quantification of the game’s economy. Here it is.

The core model of the game

At its core, the game is a pipeline: customers are put into the pipeline, they are either served or not, and they leave the pipeline. That means that only two things matter: the pipeline’s input load and the pipeline’s capacity. A pipeline whose input load is larger than its capacity will be fully utilized and start queuing people up. A pipeline whose input load is lower than its capacity will be underutilized and have free capacity, but nobody will queue up. For example, imagine a literal tunnel with a capacity of 5 people per minute who can pass through. If the tunnel’s input load is larger than 5 people per minute, i.e., 6 people arrive at the tunnel every minute, people will start queuing up. (The time people wait to get into the tunnel doesn’t matter, by the way! If a pipeline’s capacity is exhausted, the wait time for customers will go to infinity.)
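The tunnel example can be sketched in a few lines of JavaScript (the numbers are the ones from the text; the minute-by-minute bookkeeping is a simplification):

```javascript
// Toy pipeline: input load vs. capacity, both in people per minute.
// If load exceeds capacity, the excess queues up and the queue grows forever.
function simulateQueue(inputLoad, capacity, minutes) {
  let queue = 0;
  for (let t = 0; t < minutes; t++) {
    queue += inputLoad;                 // people arriving this minute
    queue -= Math.min(queue, capacity); // people passing through this minute
  }
  return queue;
}

simulateQueue(6, 5, 10); // load 6/min vs. capacity 5/min: queue grows by 1 each minute
simulateQueue(4, 5, 10); // load below capacity: nobody ever queues
```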

In Cafe Mania, these characteristics of a cafe determine its attributes as a pipeline:

  • The input load is determined by the buzz.
  • The capacity is actually broken into two distinct pieces:
    • The capacity to serve customers when a dish is available,
    • The capacity to cook dishes for customers.

That’s it. Everything else in the cafe somehow plays into these two characteristics. However, the complexity in Cafe Mania’s case arises from the fact that the input load self-regulates. That’s the whole point about the buzz: it goes up and down, and when it goes up and down, the cafe’s input load changes. The key to understanding this process is that it will eventually reach equilibrium: each cafe, with its stoves, counters, waiters, tables, dishes and layout, will fairly quickly reach a state in which the buzz actually doesn’t move anymore. Which buzz is that? It only depends on how we calculate the buzz. Let’s say we do this:

  • When a customer leaves the cafe happy, we add +0.1 to the buzz.
  • When a customer leaves the cafe unhappy, we subtract buzz punishment factor x 0.1 from the buzz.

What we call the buzz punishment factor here is the number of happy customers you need to serve in order to offset one unhappy customer. Say we set this punishment factor to 2. That means that if we have 2 happy customers for each unhappy customer, we will add # happy customers x 0.1 = 2 x 0.1 = 0.2 to the buzz, and we will subtract # unhappy customers x buzz punishment factor x 0.1 = 1 x 2 x 0.1 = 0.2 from the buzz. Of course, 0.2 – 0.2 = 0. So if our cafe produces 2 happy customers for each unhappy customer, it will be in equilibrium. In other words, its buzz won’t change. We can express that in terms of a serving ratio: if 2 customers are happy and 1 is unhappy, then 2 have been served and 1 hasn’t been served. So our cafe has a serving ratio = 2 / (2 + 1) = 67%. 67% of customers walking into our restaurant will get served.
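The same arithmetic in JavaScript, with the punishment factor of 2 from the text:

```javascript
// Buzz update per batch of customers; the punishment factor is the number of
// happy customers needed to offset one unhappy customer (here: 2).
const PUNISHMENT = 2;
const buzzDelta = (happy, unhappy) =>
  happy * 0.1 - unhappy * PUNISHMENT * 0.1;

// 2 happy per 1 unhappy: the buzz doesn't move -> equilibrium.
buzzDelta(2, 1); // 0

// The corresponding serving ratio: 2 served out of 3 who walked in.
const servingRatio = 2 / (2 + 1); // ≈ 0.67
```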

This is the trick: note that this equilibrium serving ratio doesn’t depend on anything but the buzz punishment factor. It’s not dependent on our cafe’s layout, the number of waiters or anything else. It is the serving ratio that any cafe must eventually converge towards. This convergence will happen fairly quickly. This equilibrium serving ratio is the link that connects all elements in the game.

Imagine you start your cafe, and at the initial buzz level of buzz = 5.0 we send you 5 customers per minute. Your cafe is set up in a way, however, that it can actually serve more customers than that. (We’ll see later how we can calculate that.) Say that it can serve 10 customers per minute. This means that every customer walking into the cafe will be served. Of course, that means your cafe’s serving ratio is 100%. This is above the game’s equilibrium serving ratio (which, remember, we forced to 67% by choosing the buzz punishment factor := 2!). You have happy customers leaving the restaurant, but not a single unhappy customer leaves the restaurant. That means your buzz will increase +0.1 for each customer leaving the restaurant. What will it increase to? As the buzz increases, we will send you more and more customers per minute. (We will increase the cafe’s input load.) At 10 customers per minute, your cafe will still be serving at a serving ratio of 100%. But at 15 customers per minute, 5 customers will be leaving the cafe unhappy for each 10 who are leaving happy. That is just the equilibrium serving ratio of 67% = 10 served / (10 served + 5 not served). So that’s the maximum rate we will eventually be sending to your cafe. (In the game, that corresponds to a buzz rating of 45, by the way, but that’s really an irrelevant number, as we should just think in terms of frequencies, or customers per minute.)

To repeat:

  • We choose a buzz punishment factor for the game.
  • This punishment factor determines the game’s equilibrium serving ratio.
  • Each cafe will self-regulate such that its buzz will be just so that its serving ratio converges towards the equilibrium serving ratio.
  • The buzz which the cafe will reach is determined by the cafe’s capacity. At the cafe’s equilibrium buzz, the cafe’s capacity is equal to the equilibrium serving ratio x the input load we’re putting onto the cafe.
  • A cafe’s capacity is determined by its layout, its dishes, the number of waiters, the number of tables, and the time it takes a customer to eat. It is not influenced by anything else. (For example, it is NOT influenced by the customer wait time.)
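The steps above can be simulated in a few lines of JavaScript. One assumption is mine: I map the buzz 1:1 to the input load (buzz 5.0 means 5 customers per minute, matching the text’s starting example), whereas the real game’s mapping differs at higher buzz levels; the convergence behavior is the same either way.

```javascript
// Toy self-regulation loop: the input load equals the buzz (an assumption
// for illustration), the cafe's capacity is fixed, and the buzz moves by
// +0.1 per happy customer and -punishment * 0.1 per unhappy customer.
function equilibriumLoad(capacity, punishment, startBuzz, minutes) {
  let buzz = startBuzz;
  for (let t = 0; t < minutes; t++) {
    const arriving = buzz;                        // input load driven by buzz
    const served = Math.min(arriving, capacity);  // cafe serves what it can
    const unserved = arriving - served;
    buzz += served * 0.1 - unserved * punishment * 0.1;
  }
  return buzz;
}

// Capacity 10/min, punishment factor 2: the load converges towards
// capacity / equilibrium serving ratio = 10 / (2/3) = 15, as in the text.
equilibriumLoad(10, 2, 5, 200);
```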

In the example above, we simply assumed that our cafe’s capacity is 10 customers per minute. This figure is decisive for the buzz towards which a cafe will converge. It is influenced by two things: 1) the cafe’s capacity to cook and 2) the cafe’s capacity to serve. We can view these two as different pipelines that the customers are flowing through sequentially. In two sequential pipelines, the capacity of the smaller pipeline determines the capacity of the overall pipeline. That’s what happens here, too.

The cafe’s serving capacity

Assume that we have an infinite number of dishes ready to serve. Then the cafe’s capacity to serve is determined only by how many customers we can seat, feed, and get out of the restaurant before the next customer comes. In the case of just one table, this capacity is simple to calculate: say it takes a customer 2 minutes to eat. Then the cafe’s capacity is 1 / 2 minutes = 0.5 customers per minute. In the case of multiple tables, simply multiply this capacity by the number of tables available. A cafe with 2 tables and 2 minutes of eating time per customer has a capacity of 2 x 1 / 2 minutes = 1 customer per minute. That’s easy to imagine: in minute 0, a customer shows up and sits down. In minute 1, another customer shows up, but the original customer is still eating for another minute. So the new customer sits down at the second table. A minute later, the first customer gets up and walks out. At that very moment, the next customer shows up, and he sits down at the now-empty first table. And so on.

To repeat – a cafe’s serving capacity is:
Maximum service frequency = number of tables x 1 / total customer service time

The total customer service time is the time from when the customer walks in, sits down, gets served, eats and gets up. So we can write it as:

Total customer service time = time to serve + time to eat

Time to eat here is a constant that we configure in the game. Time to serve is more complicated: it is determined by the waiters’ average path length and by the number of waiters. Simplified, we can say:

Total customer service time = time to serve an average customer / number of waiters + time to eat

Substituting this in above, we get:
Maximum service frequency = number of tables x 1 / (time to serve an average customer / number of waiters + time to eat)
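The serving-capacity formula as a function (JavaScript; parameter names are illustrative, all times in minutes):

```javascript
// Maximum service frequency in customers per minute, per the formula above.
function maxServiceFrequency(tables, serveTimePerCustomer, waiters, eatTime) {
  return tables * 1 / (serveTimePerCustomer / waiters + eatTime);
}

// The two-table example from the text: 2 tables, 2 minutes of eating time,
// serving time assumed negligible -> 1 customer per minute.
maxServiceFrequency(2, 0, 1, 2); // 1
maxServiceFrequency(1, 0, 1, 2); // 0.5, the one-table case
```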

The cafe’s cooking capacity

The other part of the cafe’s pipeline is the cooking. A dish’s cooking time determines the serving ratio I can attain with that dish. It’s simple:

Dish serving ratio = time to consume the dish / time to cook the dish

Imagine it takes 10 minutes to cook a dish, but only 5 minutes to consume it. Half the time my cafe will be out of dishes. In other words, dish serving ratio = 5 / 10 = 50%. 50% of customers will leave unhappy. But what does the time to consume a dish depend on? (By “time to consume the dish” I mean the time until all of the dish’s servings are gone.) It depends on the frequency with which customers are eating it, of course. Which brings us back to the buzz. Because:

Dish serving ratio = (# servings / customer frequency) / time to cook the dish

Remember that if our cafe’s serving ratio is below 67% (below the game’s equilibrium serving ratio), something will change: the buzz will go down. In this case, and if we are only cooking this dish, the buzz will regulate down such that the customer frequency is just low enough that the time to fully consume the dish rises to 6.7 minutes. Because then:

Dish serving ratio = time to consume the dish / time to cook the dish = 6.7 minutes / 10 minutes = 67%

If this dish gives me 10 servings:

Servings / customer frequency = 6.7 minutes => customer frequency = 10 servings / 6.7 minutes ≈ 1.5 customers per minute

The buzz will self-regulate such that we will send roughly 1.5 customers per minute to the restaurant.
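In JavaScript, with the 10-serving, 10-minutes-to-cook dish from the text:

```javascript
// Dish serving ratio per the formula above: the time until all servings are
// consumed, divided by the time to cook the dish (all times in minutes).
const dishServingRatio = (servings, customerFreq, cookTime) =>
  (servings / customerFreq) / cookTime;

// At equilibrium the ratio must equal 67%, so the buzz settles at a customer
// frequency where servings / customerFreq = 0.67 * cookTime = 6.7 minutes:
const equilibriumFreq = 10 / 6.7; // ≈ 1.5 customers per minute
```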

Putting it all together

We have two pipelines in sequence in the cafe: the cooking pipeline and the serving pipeline. Each pipeline’s capacity can be calculated. The cafe’s overall capacity is the minimum of the two (because they are sequential). The cafe’s buzz will self-regulate down or up such that the frequency at which customers are sent to the cafe is equal to the cafe’s overall capacity.

Max customer frequency based on cooking capacity = (1 / equilibrium serving ratio) x # total servings / time to cook = (1 / equilibrium serving ratio) x # servings x # stoves / time to cook
Max customer frequency based on serving capacity = (1 / equilibrium serving ratio) x number of tables x 1 / total time to serve = (1 / equilibrium serving ratio) x number of tables x 1 / (customer eat time + time to serve per waiter / # waiters)

The cafe’s max customer frequency is:
Max customer frequency = min (max customer frequency based on cooking capacity, max customer frequency based on serving capacity)

Equilibrium serving ratio = buzz punishment factor / (buzz punishment factor + 1)
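All of the above fits in one function (JavaScript; the sample numbers in the call are made up for illustration, not taken from the game):

```javascript
// Combine the two pipeline capacities, per the formulas above, into the
// maximum customer frequency the cafe's buzz will converge towards.
function cafeMaxCustomerFrequency(cfg) {
  const ratio = cfg.punishment / (cfg.punishment + 1); // equilibrium serving ratio
  const cooking = (1 / ratio) * cfg.servings * cfg.stoves / cfg.cookTime;
  const serving = (1 / ratio) * cfg.tables /
    (cfg.eatTime + cfg.serveTime / cfg.waiters);
  return Math.min(cooking, serving); // sequential pipelines: the smaller wins
}

cafeMaxCustomerFrequency({
  punishment: 2, servings: 10, stoves: 1, cookTime: 10,
  tables: 2, eatTime: 2, serveTime: 1, waiters: 2,
}); // here the serving pipeline (1.2/min) is the bottleneck, not cooking (1.5/min)
```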

I’m Mario

I am the co-founder and CEO of Oscar Health. Previously, I was the chief scientist of what was at one point the largest social gaming company in Latin America. I used to work as a senior investment associate at Bridgewater Associates, the largest global macro hedge fund in the world. Before that, I worked at McKinsey & Company. I also once was a visiting scholar in computer science at Stanford University. I wrote a whole bunch of publications. Among them was a paper on an algorithm we called Eigentrust, which enables a convergent, distributed calculation of trust scores in dynamic networks. It is among the most-cited computer science papers since it came out in 2003.

Short trips through the complex plane

The other day I got obsessed with trying to find the smallest possible code that creates the most interesting visual complexity. Fractals, and in particular the Mandelbrot set, come to mind quickly, so I wanted to visualize the Mandelbrot set, and I wanted to do it in JavaScript in such a way that I could just paste it into a browser’s URL bar.

The Mandelbrot set is defined by an iterative equation in the complex plane: for each point c, you iterate

z_{n+1} = z_n^2 + c, starting from z_0 = 0,

until the absolute value of z is larger than 4, or until some predefined number of iterations is done and the value hasn’t diverged. You use the number of iterations at each point as the value at those particular coordinates. Writing z_n = r_n + i_n·i and c = cr + ci·i, the real part becomes

r_{n+1} = r_n^2 - i_n^2 + cr

And the imaginary part is

i_{n+1} = 2 · r_n · i_n + ci
Now you just traverse x and y of a canvas, use those as coordinates cr and ci in the equation above, and iterate until the absolute value of the complex number is larger than 4, or you exceed some maximum number of iterations.
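A sketch of that per-point iteration in JavaScript (my own minimal version for illustration, not the original code; following the text, it bails out when the absolute value exceeds 4):

```javascript
// Escape-time iteration for one point (cr, ci) of the complex plane.
// Returns the number of iterations until |z| > 4, or maxIter if it never escapes.
function mandelbrotIterations(cr, ci, maxIter) {
  let r = 0, i = 0;
  for (let n = 0; n < maxIter; n++) {
    const r2 = r * r - i * i + cr;   // real part of z^2 + c
    i = 2 * r * i + ci;              // imaginary part of z^2 + c
    r = r2;
    if (Math.sqrt(r * r + i * i) > 4) return n;
  }
  return maxIter;
}

mandelbrotIterations(0, 0, 50);  // 0 + 0i is in the set: never escapes
mandelbrotIterations(2, 2, 50);  // far outside the set: escapes quickly
```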

Now you confront the question of how to visualize the set: you have an iteration count for each coordinate in your image, but how do you draw that? You can do all kinds of color tricks (for example, RGB = {# iterations, # iterations + 128, # iterations + 64} looks nice), but I thought it would look nice to just show contours. So I’m calculating and plotting the binary difference in r- and i-direction in each image (using just one array for that in the code). In the code, that’s variable p.

I wanted to have something dynamic and interesting to watch though, and this just creates a static image. So instead, let’s successively zoom into the Mandelbrot set. The Mandelbrot set is a fractal, so it’s self-similar and infinite. You can just pick a smaller and smaller range for variables cr and ci in floating point, and each range will produce results. So the first step is easy: pick a new starting coordinate for each new Mandelbrot image by multiplying the previous coordinate with a constant scale factor (and do some corrections to keep the image centered in your canvas). That’s the second r0 and i0 update block in the code. s is the current scale that gets multiplied with the same factor in each image iteration (I chose 0.98, decrease it to make it zoom more quickly).

The problem is now that this visualization can quickly become boring, if you’re looking at the wrong place in the complex plane where there’s nothing much going on. So I want to modify the starting coordinates for each frame in each image iteration in such a way that the algorithm is “looking for” the greatest complexity in the image. We do that by calculating the center of gravity of black pixels in each image, essentially as the simple sum of all vectors pointing from the center of the frame to each pixel (variables wx and wy). In each image iteration, we now want the focus to move away from the center of gravity if there is an overhang of black pixels, or towards the center of gravity if not. That’s determined by a running sum of black (1) and white (-1) pixels in variable w. Then you can calculate each image iteration’s movement in the x- and y-direction as the product of the signs of variables wx and w or wy and w, respectively.
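A guess at what that drift computation might look like in JavaScript (the flat row-major grid layout and the sign conventions here are my assumptions, not necessarily those of the original code):

```javascript
// Compute the per-frame drift direction from a binary pixel grid
// (1 = black, 0 = white), per the description above.
function driftDirection(pixels, width, height) {
  let wx = 0, wy = 0, w = 0;
  for (let y = 0; y < height; y++) {
    for (let x = 0; x < width; x++) {
      const black = pixels[y * width + x] === 1;
      w += black ? 1 : -1;            // running overhang of black over white
      if (black) {                    // vector from frame center to this pixel
        wx += x - width / 2;
        wy += y - height / 2;
      }
    }
  }
  // Product of signs: move away from the black center of gravity when black
  // pixels dominate (w > 0), towards it otherwise.
  return [Math.sign(wx) * Math.sign(w), Math.sign(wy) * Math.sign(w)];
}
```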

The more you zoom in, the longer you need to iterate the complex equation above to find any kind of interesting “information”. Otherwise you’ll soon find yourself in just black or white. So I’m solving that in an incredibly simple way: if the density of black pixels falls below 20% of the overall image, increase the iteration range m by 1. That means the zoom operation will never run out of interesting stuff. (I guess the floating point range could run out? I don’t know how JavaScript represents those, to be honest. I thought I could replace the floating point variables with integers and move the fixed point, but it’s not THAT trivial, although I think there is an approximation to be found somewhere in the fact that we put a limit of 4 on the absolute value. But anyway.)

Finally, this can get pretty slow, particularly further “down” in the Mandelbrot set. So I want to downsample the whole thing: only compute every b-th pixel, and then blow those up in the image. So the code does exactly that: it skips to every b-th pixel and then draws b pixels in the x- and y-direction around it. It seems to me that b=2 works well on a laptop, and b>=3 on a mobile browser.

Randomize the starting point within a small predefined range, and because the Mandelbrot set is chaotic (i.e., small changes in starting conditions extremely quickly lead to unpredictable differences in the set), the journey will look completely different each time.

Here is the code: