Redis maxclients incorrectly configured

Hi, my Sidekiq service was failing with Redis::CommandError (ERR max number of clients reached), so I ran INFO via the Redis CLI and it shows maxclients:25.

# Clients
connected_clients:21
cluster_connections:0
maxclients:25
client_recent_max_input_buffer:51
client_recent_max_output_buffer:0
blocked_clients:10
tracking_clients:0
clients_in_timeout_table:10

The pricing page indicates that my free Redis service should have a max of 50 connections. Is “max number of clients” the same thing as Render’s “connection limit”? Should maxclients be 50?

I am trying to understand the connections and clients for my Redis instance so that I may better configure Sidekiq.

Hi Aaron,

Thanks for reaching out.

I’m not seeing the same on a newly provisioned Redis free plan in my personal account.

It does seem we increased the connection limits just before Redis on Render was made generally available. Was your Redis instance created during early access?

Thanks

Alan

Hmm, that might be the ticket! I created a new Redis instance and it does indeed show a different output, i.e. maxclients:50 :tada:

# Clients
connected_clients:13
cluster_connections:0
maxclients:50
client_recent_max_input_buffer:51
client_recent_max_output_buffer:0
blocked_clients:10
tracking_clients:0
clients_in_timeout_table:10

I’m still curious: does each call to Redis from either a Rails web service or a Sidekiq background worker establish a new connection? Or is there any detailed Render Redis documentation that could help me understand how it works (beyond what is publicly available)? I thought my Rails app was a single client/connection, Sidekiq another, and my local database client connecting externally a third.
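On the “is each call a new connection?” question: with a pooled client it shouldn’t be. Here’s a deliberately simplified, pure-Ruby sketch of the general idea behind a fixed-size connection pool (this is not the real connection_pool gem Sidekiq uses, and the string “connections” are stand-ins for real Redis clients):

```ruby
require "thread" # Queue is in core in modern Ruby; this require is a no-op there

# Simplified sketch of what a connection pool does: a fixed set of
# long-lived connections is shared, so individual Redis calls borrow an
# existing connection instead of opening a new one each time.
class TinyPool
  def initialize(size)
    @conns = Queue.new
    # Stand-ins for real Redis clients; a real pool would build them lazily.
    size.times { |i| @conns << "conn-#{i}" }
  end

  # Borrow a connection for the duration of the block; blocks the caller
  # if every connection is already checked out.
  def with
    conn = @conns.pop
    yield conn
  ensure
    @conns << conn if conn
  end
end

pool = TinyPool.new(5)
pool.with { |conn| conn } # each call reuses one of the 5 pooled connections
```

So the number of Redis connections is bounded by the pool size, not by how many calls you make, which is why the server-side maxclients and your client-side pool sizes need to agree.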

I can push some code to test this theory in the meantime. I think I’ll still need to update my Sidekiq configuration to do some connection pooling, and maybe my Rails app too, which uses the new Kredis gem in several spots. Actually, since my Rails app uses Kredis more as a cache and Sidekiq uses Redis as a queue, it sounds like I’ll need two Redis instances, each with a different eviction policy.
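For reference, the eviction behavior is controlled by Redis’s maxmemory-policy setting; a sketch of what the two instances might use (how you set this on Render specifically may differ, so treat the mechanism as an assumption):

```conf
# Cache instance (Kredis): drop the least-recently-used keys when memory is full
maxmemory-policy allkeys-lru

# Queue instance (Sidekiq): never evict; reject writes rather than silently lose jobs
maxmemory-policy noeviction
```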

I’m not too familiar with the inner workings of Sidekiq, although it seems it has connection pooling/concurrency settings built in: https://github.com/mperham/sidekiq/wiki/Advanced-Options#connection-pooling
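A minimal Sidekiq initializer along those lines might look like the following; the pool sizes, concurrency value, and REDIS_URL env var are assumptions for illustration, so check that wiki page for the current options:

```ruby
# config/initializers/sidekiq.rb
# Hypothetical sketch: server pool sized to concurrency + 2 (see the check below).
Sidekiq.configure_server do |config|
  config.redis = { url: ENV["REDIS_URL"], size: 12 } # concurrency 10 + 2
end

Sidekiq.configure_client do |config|
  config.redis = { url: ENV["REDIS_URL"], size: 5 } # web processes need fewer
end
```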

Getting into the weeds, the code describing Redis connection use is here (some of it only seems relevant to the paid Sidekiq versions): https://github.com/mperham/sidekiq/blob/main/lib/sidekiq/redis_connection.rb#L41

And it looks like it will raise an error if you set your pool too small for your configured concurrency:

def verify_sizing(size, concurrency)
  raise ArgumentError, "Your Redis connection pool is too small for Sidekiq. Your pool has #{size} connections but must have at least #{concurrency + 2}" if size < (concurrency + 2)
end

So concurrency + 2 appears to be a bare minimum.

Load testing an app would be one way to highlight any connection limit issues.

Hope that helps

Alan


This topic was automatically closed 30 days after the last reply. New replies are no longer allowed.