
Pooling Part 1: Optimize your game’s performance

Why pool?

Creating and destroying GameObjects costs memory and processor power. If the garbage collector kicks in to clear unused objects from memory, your application might stutter for a moment. To avoid these hiccups you can use a technique called pooling.

The idea

The idea behind pooling is to create a bunch of objects (e.g. GameObjects) at application start or after a level has been loaded and keep them stored in a list. If you need a certain GameObject (e.g. a bullet), you pull it from the pool, work with it, and once you’re done, you put it back. By reusing your objects this way you avoid the allocations and deallocations and the nasty hiccups that can ruin your gameplay experience.


Basically, a pool is a class with a list of objects and methods to request and release them. One way to implement a pool class is to search the list every time an object is requested, but that can be unnecessarily slow. A better approach is to keep only the unused items in the list and hand the others out. For this the C# Stack class comes in handy: its Push() and Pop() methods follow the Last-In-First-Out (LIFO) principle, so if we push items onto the stack, we get them back in reverse order.
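For example, this short snippet shows the LIFO behavior of Stack (using strings here just for illustration):

```csharp
using System;
using System.Collections.Generic;

var stack = new Stack<string>();
stack.Push("first");
stack.Push("second");
stack.Push("third");

// Pop returns items in reverse insertion order (LIFO):
Console.WriteLine(stack.Pop()); // "third"
Console.WriteLine(stack.Pop()); // "second"
Console.WriteLine(stack.Pop()); // "first"
```

For a pool this ordering doesn’t actually matter — any unused object will do — but Push and Pop give us constant-time request and release operations without searching.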

Below you find the code for an implementation of a stack-based pool. It is a lazy version of a singleton class: you can request pooled GameObjects with Pool.Instance.Pop() and release them back to the pool by calling Pool.Instance.Push(myGameObject). When a GameObject is pushed into the pool, it is deactivated so that it is no longer visible and stops doing work.
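A minimal sketch of such a class could look like the following. This is a reconstruction from the description above, not the original download: the class and method names (Pool, Pop, Push) match the usage shown in the text, but details like the lazy FindObjectOfType lookup and the grow-on-empty behavior in Pop are assumptions.

```csharp
using System.Collections.Generic;
using UnityEngine;

public class Pool : MonoBehaviour
{
    // Set these in the inspector.
    public GameObject prefab;
    public int poolSize = 20;

    private static Pool instance;

    // Lazy singleton: look the pool up in the scene on first access.
    public static Pool Instance
    {
        get
        {
            if (instance == null)
                instance = FindObjectOfType<Pool>();
            return instance;
        }
    }

    private readonly Stack<GameObject> pooled = new Stack<GameObject>();

    private void Awake()
    {
        // Pre-instantiate the pooled objects, deactivated.
        for (int i = 0; i < poolSize; i++)
        {
            GameObject go = Instantiate(prefab);
            go.SetActive(false);
            pooled.Push(go);
        }
    }

    // Request an object from the pool.
    public GameObject Pop()
    {
        if (pooled.Count == 0)
        {
            // Pool exhausted: grow instead of failing (an assumption;
            // you could also return null or reuse the oldest object).
            return Instantiate(prefab);
        }
        GameObject go = pooled.Pop();
        go.SetActive(true);
        return go;
    }

    // Release an object back to the pool. Deactivating it makes it
    // invisible and stops its Update calls.
    public void Push(GameObject go)
    {
        go.SetActive(false);
        pooled.Push(go);
    }
}
```

A typical caller would then spawn a bullet with GameObject bullet = Pool.Instance.Pop(); and later return it with Pool.Instance.Push(bullet); instead of using Instantiate and Destroy.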

You can drop the script on an empty GameObject in your scene and then set the pool size as well as the pooled prefab in the inspector.

In the next part of the Pooling series I demonstrate the usage of this class in a demo project that you can download.
