Allow users to define and run custom reports to validate data within NetBox #1245

Closed
opened 2025-12-29 16:30:33 +01:00 by adam · 3 comments

Originally created by @jeremystretch on GitHub (Sep 19, 2017).

Issue type

[x] Feature request
[ ] Bug report
[ ] Documentation

Environment

  • Python version: 3.4.3
  • NetBox version: 2.1.4

Description

NetBox is intended to serve as the "source of truth" for a network, acting as the authoritative source for IP addressing, interface connections, and so on. To help guarantee the integrity of data within NetBox, I'd like to establish a mechanism by which users can write and run custom reports to inspect NetBox objects and alert on any deviations from the norm.

For example, a user might write reports to validate the following:

  • Every top-of-rack switch has a console and out-of-band connection defined in NetBox
  • There is a minimum amount of free IP space available within each site
  • All devices whose names match a pattern have been assigned the same functional role
  • Every router has a loopback interface configured with an IP address

A report would take the form of a Python class saved to a file within a parent `reports` directory (which would not be tracked by git) in the NetBox installation path. Each report class can have several methods, each of which might perform specific validation relevant to the report's purpose. This arrangement closely mimics the implementation of Python unit tests; the major difference is that we are validating data rather than code.
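
As a rough illustration, a report along these lines might look like the following. This is a minimal sketch: the `Report` base class, its import path, the `test_`-prefixed method convention, and the `log_success`/`log_failure` helpers are all assumptions about a possible API, not a settled design.

```python
from dcim.models import Device
from extras.reports import Report  # hypothetical import path for the base class


class DeviceConnectionsReport(Report):
    description = "Verify that every device has a console connection defined"

    def test_console_connection(self):
        # Check each device and log one result per object; the reverse
        # relation name (consoleports) is illustrative.
        for device in Device.objects.all():
            if device.consoleports.exists():
                self.log_success(device)
            else:
                self.log_failure(device, "No console connection defined")
```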

Reports would be executed via the API, with individual methods being run in the order they are defined. A management command (e.g. `manage.py runreport <name>`) will also be provided for development purposes and for execution by cron jobs.
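
For instance, something like `python3 manage.py runreport DeviceConnectionsReport` (the exact argument format being hypothetical at this stage) would run the sketch above from the shell, which also makes it trivial to schedule via cron.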

Each report method can produce logs and ultimately yield a pass or fail status; if one or more tests fail, the report is marked as failed. Results of the most recent test runs will be stored in the database as raw JSON, but no historical tracking will be provided. The web UI will provide a view showing the latest results of each report.
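
As a rough idea of what the stored result might contain, a hypothetical shape is sketched below; all field names are illustrative, not a schema proposal.

```python
# Hypothetical shape of the raw JSON stored for a report's most recent run.
# Every key and value here is illustrative, not a proposed schema.
result = {
    "failed": True,
    "methods": {
        "test_console_connection": {
            "success": 41,
            "failure": 2,
            "log": [
                ["failure", "switch-a1", "No console connection defined"],
                ["failure", "switch-b2", "No console connection defined"],
            ],
        },
    },
}
```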

adam closed this issue 2025-12-29 16:30:33 +01:00

@candlerb commented on GitHub (Sep 26, 2017):

> Each report method can produce logs and ultimately yield a pass or fail status

A couple of queries.

  1. Regarding reports which highlight data inconsistencies as described above: rather than returning a single "FAIL" status, or a message about the first failed instance, are you expecting it would be possible to return a structured table of *all* failure instances - e.g. as a CSV? (Aside: for some use cases see #801, #863)

  2. Are you expecting that reports could be used as a general data summary or export facility - e.g. you could use this to run a query to generate tables of stats grouped by site, rack, etc., but no "failures" as such? Or is that considered a different type of "report"?


@jeremystretch commented on GitHub (Sep 28, 2017):

> Rather than returning a single "FAIL" status, or a message about the first failed instance, are you expecting it would be possible to return a structured table of all failure instances - e.g. as a CSV?

Users will be able to log arbitrary messages within a report. Each message can be associated with a log level: success, info, warning, or failure. A report with one or more failures logged is considered to have failed.
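
Concretely, a report method can iterate over the full queryset and log one entry per nonconforming object, so it need not stop at the first failure. A sketch, reusing the same hypothetical `Report` base class and `log_*` helpers as above (the field lookups are illustrative):

```python
from dcim.models import Device
from extras.reports import Report  # hypothetical import path, as above


class RouterLoopbackReport(Report):
    description = "Every router has a loopback interface with an IP address"

    def test_loopback_addresses(self):
        # Log one entry per nonconforming device rather than stopping
        # at the first failure; field lookups here are illustrative.
        for device in Device.objects.filter(device_role__slug="router"):
            has_loopback_ip = device.interfaces.filter(
                name__istartswith="loopback",
                ip_addresses__isnull=False,
            ).exists()
            if has_loopback_ip:
                self.log_success(device)
            else:
                self.log_failure(device, "No loopback interface with an IP address")
```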

> Are you expecting that reports could be used as a general data summary or export facility - e.g. you could use this to run a query to generate tables of stats grouped by site, rack etc, but no "failures" as such?

Probably not. I mean, you *could* use reports for that, but it would be impractical to accommodate the plethora of different output formats and structures people might want to use. I think the primary focus here will be validation of the data within NetBox, in support of its function as the "source of truth." The reports page will provide a quick summary of any data in NetBox which does not conform to rules the user has defined.


@jeremystretch commented on GitHub (Sep 28, 2017):

The reports branch has been merged into develop-2.2 and will be included in v2.2-beta2.
