The queue concept in QStash enables ordered delivery (FIFO) as well as controlled parallelism. Here we list common use cases for queues and how to use them. See the API documentation for the full list of related REST APIs.

Ordered Delivery

With queues, ordered delivery is guaranteed by default. This means:

  • Your messages will be queued without blocking the REST API and sent one by one in FIFO order. Queued means a CREATED event will be logged.
  • The next message will wait for retries of the current one if it cannot be delivered because your endpoint returns a non-2xx code. In other words, the next message will become ACTIVE only after the current message is either DELIVERED or FAILED.
  • The next message will also wait for callbacks or failure callbacks to finish.
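
For example, the following request enqueues a message to the queue my-queue for delivery to https://example.com:
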
curl -XPOST https://qstash.upstash.io/v2/enqueue/my-queue/https://example.com \
  -H "Authorization: Bearer <token>" \
  -H "Content-Type: application/json" \
  -d '{"message": "Hello, World!"}'
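
Once a message is enqueued, you can follow its state transitions (CREATED, ACTIVE, then DELIVERED or FAILED) in the logs. As a minimal sketch, the events endpoint can be used to list recent deliveries; the exact filters and fields available may vary:

curl https://qstash.upstash.io/v2/events \
  -H "Authorization: Bearer <token>"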

Controlled Parallelism

If you want better throughput than one-by-one delivery while still ensuring that your endpoint is not overwhelmed, you can achieve controlled parallelism with queues.

By default, queues have a parallelism of 1. Depending on your plan, you can configure the parallelism of your queues as follows:

curl -XPOST https://qstash.upstash.io/v2/queues/ \
  -H "Authorization: Bearer <token>" \
  -H "Content-Type: application/json" \
  -d '{
    "queueName": "my-queue", 
    "parallelism": 5,
  }'
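
As far as we know, this endpoint acts as an upsert, so calling it again later with a different parallelism value should update the existing queue rather than create a new one.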

After that, you can use the enqueue path to send your messages.

curl -XPOST https://qstash.upstash.io/v2/enqueue/my-queue/https://example.com \
  -H "Authorization: Bearer <token>" \
  -H "Content-Type: application/json" \
  -d '{"message": "Hello, World!"}'
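
Since the queue above has a parallelism of 5, messages are still dispatched from the queue in order, but up to five of them can be in flight at the same time. As an illustrative sketch, enqueuing several messages in a loop looks like this:

for i in $(seq 1 10); do
  curl -XPOST https://qstash.upstash.io/v2/enqueue/my-queue/https://example.com \
    -H "Authorization: Bearer <token>" \
    -H "Content-Type: application/json" \
    -d "{\"message\": \"Hello #$i\"}"
done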

You can check the parallelism of a queue with the following API:

curl https://qstash.upstash.io/v2/queues/my-queue \
  -H "Authorization: Bearer <token>"