Timeout errors due to DB pool exhaustion when creating a release from the UI for a project with many packages

While creating a release from the UI in Octopus for a project that has 29 different packages to deploy, our DB connection pool was exhausted and we started seeing the following error:

“Timeout expired. The timeout period elapsed prior to obtaining a connection from the pool. This may have occurred because all pooled connections were in use and max pool size was reached.”

Another part of the logs:
ERROR There are a high number of transactions active. The below information may help the Octopus team diagnose the problem:
Now: 2021-06-02T00:15:59

There were 98 such queries listed in the logs, even though we had only 29 packages in the project for a single release. We only get these timeouts when we try to create the release from the UI, as it tries to fetch details about all packages required for the release.

I understand we can increase the pool limit to avoid this issue, but I was hoping the app itself would have some logic to not initiate this many requests, or to fine-tune the logic with some batch/paging process.
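As an illustration of the batching/throttling idea being asked for (a hypothetical sketch, not Octopus code — `fetch_package_details` is a made-up stand-in for the per-package lookup), capping the number of in-flight lookups bounds how many pooled DB connections a single release-creation request can hold at once:

```python
import concurrent.futures
import threading

# Hypothetical stand-in for the per-package lookup the server performs;
# in the real server each call would borrow a pooled SQL connection.
def fetch_package_details(package_id):
    return {"id": package_id, "feed": "feeds-builtin"}

def fetch_all(package_ids, max_concurrent=10):
    """Fetch details for every package, but never run more than
    max_concurrent lookups (and therefore connections) at once."""
    gate = threading.Semaphore(max_concurrent)

    def guarded(pid):
        with gate:
            return fetch_package_details(pid)

    with concurrent.futures.ThreadPoolExecutor(max_workers=max_concurrent) as pool:
        return list(pool.map(guarded, package_ids))

results = fetch_all([f"Package-{n}" for n in range(29)])
print(len(results))  # 29 lookups completed, at most 10 in flight at any moment
```

With this shape, 29 packages still get resolved, but the pool sees at most `max_concurrent` connections from the request instead of one per package.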

Master server details:
Octopus version: 2020.2.15
Instance type: C5.2xlarge
Number of nodes in cluster: 1
Task cap: 50
DB connection pool limit: 200

Also, if there is any guideline on the relationship between the task cap and the DB connection pool limit, I would like to go through it.
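For what it's worth, a rough back-of-the-envelope heuristic (my own assumption, not an official Octopus guideline) is to size the pool above the task cap, since every running task can hold connections and UI/API requests like release creation need headroom on top:

```python
# Hypothetical sizing heuristic, NOT an official Octopus formula:
# assume each concurrently running task may hold a couple of connections,
# plus a fixed headroom for UI/API traffic such as release creation.
def min_pool_size(task_cap, connections_per_task=2, ui_api_headroom=50):
    return task_cap * connections_per_task + ui_api_headroom

print(min_pool_size(50))  # 150 -> a configured limit of 200 leaves some margin
```

The parameter values here are illustrative guesses; the point is only that the pool limit should comfortably exceed the task cap, which the setup above (task cap 50, pool limit 200) already does.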

Thanks and Regards,

Hi Devan,

Thanks for getting in touch!

My recommendation would be to look into upgrading to the latest version when possible. Over the last year or so I have seen several SQL connection pool issues come up and get fixed.

Also, are you using the Jira or Azure DevOps issue trackers at all? I'm wondering, if Octopus is having to retrieve work items for a large number of packages at once, whether this could be causing an issue.


Hi Paul,

Usually we don't face DB connection pool issues; it only happens when we create a release from the UI for a certain project with 29 packages per release.
I just realised the logs in my original post were missing the query that caused the DB connection exhaustion.
Here it is:

Transaction ‘PackageVersionSearchAction’ Open with 1 commands started at 2021-06-02T00:12:51 (188.26 seconds ago)
2021-06-02T00:12:51 SELECT TOP 1 *
FROM dbo.[Feed]
WHERE ([Id] = @id)

Before the release creation call, the number of active DB connections shoots from 85 to 269. There were 98 occurrences of the warning "There are a high number of transactions active. The below information may help the Octopus team diagnose the problem".


Thanks for the additional information.

As mentioned, the first step would be to upgrade to the latest version and see if the issue persists.

If it does please let me know and I’ll be able to bring this to the attention of our engineers and see if we’re able to replicate and improve the query in some way.

Hi Paul,

We have seen this behaviour in version 2019.6 as well. We are already planning to upgrade at the end of this month, but could you get this behaviour checked at your end? Ideally, for 29 packages there shouldn't be this many calls to the DB. I have already checked the release notes and I don't see anything specific related to this behaviour. I would appreciate it if this could be checked, as the version upgrade is going to take some time.


I attempted to replicate this earlier on our latest version using a project with over 30 different packages without success.
The release always creates instantly for me.

Were you able to check the number of calls to the database for the query I provided when you created the release? Also, what is your DB pool limit set to? I can try with that.


This test VM is using the default SQL settings, so I believe that will be a pool limit of 100.
I don’t see the number of connections increase when creating a release.

Hi Paul,

We are planning to upgrade Octopus to the latest version next month. Meanwhile, we are working on the upgrade in a staging instance to replicate the behaviour. We will let you know if it occurs in staging with the latest version.


