Data import should not get stuck in duplicates #10125

Closed
opened 2025-12-29 21:27:10 +01:00 by adam · 2 comments
Owner

Originally created by @Xemanth on GitHub (Aug 21, 2024).

Deployment Type

Self-hosted

NetBox Version

v4.0.8

Python Version

3.12

Steps to Reproduce

When you import data, such as devices, into NetBox using a CSV file, the import gets stuck on unique records that already exist in the database.

Expected Behavior

Data import should not get stuck on duplicate records. The UI should provide an option to skip existing records, or to proceed with importing the remaining lines and notify the user afterward about the skipped records. Perhaps a checkbox on the import page could enable this kind of line skipping, so the original behavior would stay as it is for those who prefer it.

Observed Behavior

The UI briefly displays an error about the duplicate record before returning to the data import front page. The admin must then manually remove the line that the NetBox UI flagged and rerun the import to check for any additional duplicate lines. This takes a lot of time when there is a lot of data to be imported.

adam closed this issue 2025-12-29 21:27:10 +01:00
Author
Owner

@jeremystretch commented on GitHub (Aug 21, 2024):

This is working as intended. Bulk imports are all-or-none, ensuring that individual errors do not result in partial operations that are difficult to resolve.

Author
Owner

@Xemanth commented on GitHub (Aug 21, 2024):

If I have, say, 1000 lines in the CSV, it's quite a big task to find all the duplicates in the material, as NetBox doesn't report them all in the same import; it can take a dozen import attempts depending on how many duplicates there are. There should be an easier way to import everything quickly.
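Until something like this is supported in the UI, one workaround is to pre-filter the CSV before uploading it: split the file into rows that are new and rows whose key already exists in NetBox, then import only the new rows. A minimal sketch, assuming the records are keyed by a `name` column; the function name `split_csv_rows` is hypothetical, and in practice the set of existing names would be fetched via the NetBox API rather than hard-coded:

```python
import csv
import io

def split_csv_rows(csv_text, existing_names, key="name"):
    """Split CSV rows into (new_rows, skipped_rows) by checking whether
    the value in the `key` column already exists in NetBox.

    existing_names: set of values already present (e.g. device names
    fetched from the NetBox API)."""
    reader = csv.DictReader(io.StringIO(csv_text))
    new_rows, skipped = [], []
    for row in reader:
        (skipped if row[key] in existing_names else new_rows).append(row)
    return new_rows, skipped

# Example: "sw2" already exists in NetBox, so it is skipped.
csv_text = "name,role\nsw1,switch\nsw2,switch\nsw3,switch\n"
new_rows, skipped = split_csv_rows(csv_text, {"sw2"})
print([r["name"] for r in new_rows])  # → ['sw1', 'sw3']
print([r["name"] for r in skipped])   # → ['sw2']
```

This reports every duplicate in one pass instead of one per failed import, and the remaining rows can be written back out with `csv.DictWriter` for a clean all-or-none upload.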


Reference: starred/netbox#10125