Implement django-rq retry mechanism #7936

Closed
opened 2025-12-29 20:30:14 +01:00 by adam · 2 comments
Owner

Originally created by @abhi1693 on GitHub (Apr 22, 2023).

Originally assigned to: @abhi1693 on GitHub.

NetBox version

v3.4.8

Feature type

New functionality

Proposed functionality

Link: https://python-rq.org/docs/exceptions/

Ideally, jobs would retry with exponential back-off and finally fail after three days. If a job retries for that long and still fails, it would be up to the user to figure out how to recover the data that was missed. A sketch of how this could be expressed with python-rq follows below.
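
For illustration, python-rq's built-in `Retry` class can already attach a retry schedule to a job at enqueue time. The sketch below shows one way doubling intervals capped at roughly three days might look; the `exponential_intervals()` helper and the `some_webhook_task` callable are hypothetical and not existing NetBox code.

```python
# Minimal sketch, assuming python-rq >= 1.5 (which provides rq.Retry).
# exponential_intervals() and some_webhook_task are illustrative only.
from datetime import timedelta

from django_rq import get_queue
from rq import Retry


def exponential_intervals(base=60, ceiling=timedelta(days=3).total_seconds()):
    """Doubling retry delays (in seconds) whose cumulative total stays under `ceiling`."""
    intervals, total, delay = [], 0, base
    while total + delay <= ceiling:
        intervals.append(int(delay))
        total += delay
        delay *= 2
    return intervals


def some_webhook_task(payload):
    """Placeholder for the actual webhook delivery job."""
    ...


intervals = exponential_intervals()  # 60, 120, 240, ... totalling just under 3 days
queue = get_queue("default")
queue.enqueue(
    some_webhook_task,
    {"event": "updated"},
    retry=Retry(max=len(intervals), interval=intervals),
)
```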

Use case

When I create webhooks, there are times when the receiver is down and the webhooks are missed. When the receiver comes back online, it does not receive the changes. To work around this, we have to build extra tooling to find the data that was missed during the downtime and trigger the webhooks again.

An exponential retry mechanism would avoid such issues to a large extent.

Database changes

No response

External dependencies

No response

adam added the status: accepted and type: feature labels 2025-12-29 20:30:14 +01:00
adam closed this issue 2025-12-29 20:30:15 +01:00
Author
Owner

@abhi1693 commented on GitHub (May 8, 2023):

@jeremystretch Is it alright if I introduce a new configuration parameter that manages the retry-related settings? I think that would provide the most user-friendly solution. A rough sketch of what I have in mind is below.
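
A configuration-driven approach might look roughly like this sketch. The setting names `RQ_RETRY_MAX` and `RQ_RETRY_INTERVALS` and the `enqueue_with_retry()` helper are invented for illustration and are not existing NetBox settings.

```python
# Hypothetical sketch only: RQ_RETRY_MAX, RQ_RETRY_INTERVALS and
# enqueue_with_retry() are invented names, not existing NetBox configuration.
from django.conf import settings
from django_rq import get_queue
from rq import Retry


def enqueue_with_retry(func, *args, **kwargs):
    """Enqueue a job, attaching a Retry policy only if the operator configured one."""
    retry = None
    max_retries = getattr(settings, "RQ_RETRY_MAX", 0)
    if max_retries:
        retry = Retry(
            max=max_retries,
            interval=getattr(settings, "RQ_RETRY_INTERVALS", 60),  # int or list of seconds
        )
    return get_queue("default").enqueue(func, *args, retry=retry, **kwargs)
```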

Author
Owner

@jeremystretch commented on GitHub (May 10, 2023):

@abhi1693 Without digging into this myself, it's hard to know what would be needed, but I'm not opposed to it.
