[Django Tip] Gunicorn Async Workers Showcase

A showcase of Gunicorn async workers with clear example.

Lu-Hsuan Chen
2 min read · Dec 2, 2020

Scenario

Recently, I have been working on a project that uses Django as an API server, and one special API generates a file whose generation time and size are arbitrary. Whenever I called this API, I got a 502 error. At first, I thought it might be a timeout problem, so I raised the timeout settings of both Django and Apache (used as a reverse proxy) to 300 seconds. However, I still received 502 errors.

Then I started to investigate the cause and came to understand the meaning of Gunicorn's worker setting. After I changed the worker type from sync to an asynchronous worker, this particular API call behaved normally, in both response time and execution speed.

To help both myself and others understand the meaning of Gunicorn workers, I wrote this post as a reminder.

Types of Gunicorn workers

Gunicorn uses a pre-fork worker model, which means a central master process manages a set of worker processes. By default, Gunicorn uses synchronous (sync) workers, but sync workers do not support persistent connections, which means you shouldn't serve requests that take a variable (and possibly long) amount of time, exactly as in this case. [1]
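To make the problem concrete, here is a minimal sketch of a plain WSGI app simulating a slow file-generation endpoint (the delay and response body are hypothetical, standing in for the actual API):

```python
import time


def application(environ, start_response):
    """Minimal WSGI app simulating a slow file-generation endpoint.

    Under a sync worker, this call occupies the worker for the full
    duration; other requests queue behind it, and the reverse proxy
    may return a 502 once its own timeout elapses.
    """
    time.sleep(2)  # stand-in for arbitrary file-generation time
    body = b"file contents"
    start_response("200 OK", [
        ("Content-Type", "application/octet-stream"),
        ("Content-Length", str(len(body))),
    ])
    return [body]
```

With only a handful of sync workers, a few concurrent calls to an endpoint like this are enough to exhaust them all.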

As suggested in the Gunicorn documentation [1], you can apply one of the following solutions:

  1. a buffering proxy
  2. asynchronous workers

For simplicity, I chose asynchronous workers, and this solved the problem.
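As a sketch, switching the worker class can be done in a Gunicorn config file (or via the `--worker-class` command-line flag). The worker count and timeout below are illustrative, not prescriptive, and the gevent worker requires gevent to be installed separately:

```python
# gunicorn.conf.py -- a minimal sketch; adjust values to your deployment
worker_class = "gevent"  # async worker instead of the default "sync"
workers = 4              # common rule of thumb: (2 * CPU cores) + 1
timeout = 300            # keep consistent with the reverse-proxy timeout
```

You would then start the server with something like `gunicorn -c gunicorn.conf.py myproject.wsgi:application`, where `myproject` is a placeholder for your Django project's package name.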

Conclusion

If your application makes long blocking calls, e.g., to external web services, consider switching to async workers.

Reference

[1] https://docs.gunicorn.org/en/stable/design.html#choosing-a-worker-type

Originally published at https://cuda-chen.github.io on December 2, 2020.

If you have any thoughts or questions to share, please contact me at clh960524[at]gmail.com. You can also check my GitHub repositories for other work. If you are, like me, passionate about machine learning, image processing, and parallel computing, feel free to add me on LinkedIn.
