Compare commits

..

67 Commits

Author SHA1 Message Date
Mark Coleman
b6548d941b Add documentation for {module_path} placeholder
Per arthanson's review request, updated docs/models/dcim/moduletype.md
to document:
- {module} placeholder behavior (single vs multiple use)
- {module_path} placeholder for full path expansion
- Position field resolution for nested module bays
2026-01-20 18:47:02 +01:00
Mark Coleman
898fe8b3d8 Address sigprof's Jan 20 feedback
1. Add validation to reject mixing {module} and {module_path} in same attribute
2. Refactor resolve_position() to match resolve_name()/resolve_label() pattern
   - Moved to ModuleBayTemplate where it can access self.position directly
   - No longer takes position as argument
3. Added test for mixed placeholder validation
2026-01-20 10:15:12 +01:00
Mark Coleman
3680b0ccd4 Refactor: move resolve_module_placeholders from constants.py to utils.py
Constants should only contain constant values, not functions with logic.
The helper function now lives in dcim/utils.py alongside other utilities
like update_interface_bridges and create_port_mappings.
2026-01-19 19:06:42 +01:00
Mark Coleman
702b1f8210 Implement Option 2: {module_path} for full path, {module} for parent-only
Per sigprof's feedback, this implements two distinct placeholders:

- {module_path}: Always expands to full /-separated path (e.g., 1/2/3)
  Use case: Generic modules like SFPs that work at any depth

- {module} (single): Expands to parent bay position only
  Use case: Building custom paths via position field with user-controlled separators

- {module}/{module}: Level-by-level substitution (unchanged for backwards compat)

This design allows two ways to build module hierarchies:
1. Use {module_path} for automatic path joining (hardcodes / separator)
2. Use position field with {module} for custom separators

Fixes #20474, #20467, #19796
2026-01-19 18:20:59 +01:00
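As an illustration of the substitution rules described in this commit, here is a minimal standalone sketch; the `expand` helper below is hypothetical and only mirrors the `resolve_module_placeholders()` function added to `dcim/utils.py` in this branch (shown further down in this diff):

```python
# Hypothetical standalone sketch of the placeholder rules described above;
# mirrors resolve_module_placeholders() from dcim/utils.py in this branch.
def expand(text, positions, sep='/'):
    if '{module_path}' in text:
        # {module_path} always expands to the full root-to-leaf path
        text = text.replace('{module_path}', sep.join(positions))
    token_count = text.count('{module}')
    if token_count == 1 and positions:
        # single {module}: immediate parent bay position only
        text = text.replace('{module}', positions[-1], 1)
    else:
        # multiple {module}: level-by-level substitution (backwards compatible)
        for pos in positions:
            text = text.replace('{module}', pos, 1)
    return text

assert expand('eth{module_path}', ['1', '2', '3']) == 'eth1/2/3'
assert expand('Gi{module}/0/1', ['1', '2', '3']) == 'Gi3/0/1'
assert expand('Gi{module}/{module}/0/1', ['1', '2']) == 'Gi1/2/0/1'
```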
Mark Coleman
1c6adc40b3 Refactor: centralize module token substitution logic
Per sigprof's review feedback, extract the duplicated token substitution
logic into a single resolve_module_token() helper in constants.py.

This addresses two review comments:
1. Duplication between ModuleCommonForm.clean() and resolve_name()
2. Duplication between resolve_name() and resolve_label()

Benefits:
- Single source of truth for substitution logic
- MODULE_TOKEN_SEPARATOR constant for future configurability
- Cleaner, more maintainable code (-7 net lines)
- Easier to modify separator handling in one place
2026-01-19 16:18:54 +01:00
Mark Coleman
bcd3851f4e Address sigprof review: stricter token validation
Per sigprof's feedback, the previous validation (depth >= token_count)
allowed a questionable case where token_count > 1 but < depth, which
would lose position information for some levels.

New validation: token_count must be either 1 (full path expansion) or
exactly match the tree depth (level-by-level substitution).

Updated test T2 to verify this mismatched case is now rejected.
2026-01-19 16:18:54 +01:00
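The accepted and rejected combinations can be summarized as a simple predicate — a sketch of the rule only, not the exact validation code in `ModuleCommonForm.clean()`:

```python
# Sketch of the stricter rule: a {module} token count is valid only if it is 1
# (full-path/parent expansion) or exactly equals the module bay tree depth.
def tokens_valid(token_count: int, depth: int) -> bool:
    return token_count == 1 or token_count == depth

assert tokens_valid(1, 3)        # single token
assert tokens_valid(3, 3)        # level-by-level substitution
assert not tokens_valid(2, 3)    # previously allowed, now rejected (drops a level)
```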
Mark Coleman
850bfba9e4 Fix PEP8: remove trailing whitespace from blank lines 2026-01-19 16:18:54 +01:00
Mark Coleman
1df6eee467 Add position field resolution for module bays (fixes #20467)
- Add resolve_position() method to ModularComponentTemplateModel
- Update ModuleBayTemplate.instantiate() to resolve {module} in position field
- Add test_module_bay_position_resolves_placeholder test

This completes the fix for nested module placeholder issues by ensuring
the position field also resolves {module} placeholders, which is required
for building correct full paths in 3+ level hierarchies.
2026-01-19 16:18:54 +01:00
Mark Coleman
e613b55ada Fix nested module bay placeholder: single {module} resolves to full path (e.g., 1/1) 2026-01-19 16:18:54 +01:00
Jeremy Stretch
6f2ba5c75c Merge branch 'main' into feature 2026-01-06 13:05:07 -05:00
Jeremy Stretch
fa8a9ef9de Release v4.4.10 2026-01-06 12:30:03 -05:00
Jeremy Stretch
6beb079b97 Revert "Fixed #20950: Add missing module and device properties in module-bay (#21005)"
This reverts commit 860db9590b.
2026-01-06 10:38:41 -05:00
bctiemann
bad688b8aa Merge pull request #21069 from netbox-community/21067-cable-profile-error
Fixes #21067: Force update of cable terminations when changing cable profile
2026-01-06 09:48:54 -05:00
github-actions
c8aad24a1b Update source translation strings 2026-01-06 05:04:58 +00:00
bctiemann
42bd876604 Merge pull request #21072 from netbox-community/21071-exception-request-url
Closes #21071: Include the request method & URL when displaying a server error
2026-01-05 20:20:46 -05:00
bctiemann
f903442cb9 Merge pull request #21065 from netbox-community/21049-clean-stale-cf-data
Fixes #21049: Remove stale custom field data during object validation
2026-01-05 20:19:46 -05:00
Jason Novinger
5a64cb712d Fixes #21064: Ensures that extra choices preserve nested colons 2026-01-05 16:38:16 -05:00
Jason Novinger
4d90d559be Fix permission constraint example error 2026-01-05 16:33:21 -05:00
Jeremy Stretch
19de058f94 Closes #21071: Include the request method & URL when displaying a server error 2026-01-05 16:09:39 -05:00
Jeremy Stretch
d3e4c02807 Fixes #21067: Force update of cable terminations when changing cable profile 2026-01-05 15:14:04 -05:00
Jeremy Stretch
dc00e19c3c Fixes #21063: Check for duplicate choice values when validating a custom field choice set (#21066) 2026-01-05 13:10:04 -06:00
Jeremy Stretch
6ed6da49d9 Update test 2026-01-05 11:00:54 -05:00
Prince Kumar
7154d4ae2e Closes #20953: Show interfaces bridged to an interface in the UI (#21010) 2026-01-05 09:40:38 -06:00
Jeremy Stretch
bc26529be8 Fixes #21049: Remove stale custom field data during object validation 2026-01-05 09:49:32 -05:00
github-actions
da64c564ae Update source translation strings 2026-01-01 05:07:03 +00:00
Jeremy Stretch
6199b3e039 Fixes #19506: Add filter forms for component templates (#21057)
Co-authored-by: Callum <callum@reja.au>
Co-authored-by: Callum <96725140+callumau@users.noreply.github.com>
2025-12-31 09:50:39 -06:00
Jeremy Stretch
ebada4bf72 Closes #21001: Annotate plugin filterset registration in v4.5 release notes (#21058) 2025-12-31 09:42:47 -06:00
github-actions
2a391253a5 Update source translation strings 2025-12-31 05:05:09 +00:00
Jason Novinger
914653d63e Fixes #21045: Allow saving Site with associated Prefix
This was a result of the fix for #20944 optimizing a query to only
include the `id` field with `.only(id)`. Since `Prefix.__init__()`
caches original values from other fields (`_prefix` and `_vrf_id`),
these cached values are `None` at init-time.

This might not normally be a problem, but the sequence of events in
the bug report also ends up causing the `handle_prefix_saved` handler
to run, which uses an ORM lookup (either `net_contained_or_equal` or
`net_contained`) that does not support a query argument of `None`.
2025-12-30 12:26:48 -05:00
Martin Hauser
3813aad8b1 Fixes #20320: Ensure related interface options availability in bulk edit (#21006) 2025-12-30 10:17:14 -06:00
Jeremy Stretch
ea5371040e Fixes #20817: Re-enable sync button when disabling scheduled syncing for a data source (#21055) 2025-12-30 10:05:08 -06:00
Unknown
6c824cc48f Fixes #20044: Elevations stuck in light mode (#21037)
Co-authored-by: UnknownTy <meaphunter+git@hotmail.com>
Co-authored-by: Jason Novinger <jnovinger@gmail.com>
2025-12-29 16:27:03 -06:00
Jeremy Stretch
c78b8401dc Fixes #21020: Fix object filtering for image attachments panel (#21030) 2025-12-29 15:19:24 -06:00
Jeremy Stretch
f510e40428 Closes #21047: Add compatibility matrix to plugin setup instructions (#21048) 2025-12-29 11:39:51 -06:00
Prince Kumar
860db9590b Fixed #20950: Add missing module and device properties in module-bay (#21005) 2025-12-23 13:34:06 -06:00
Jeremy Stretch
7c63d001b1 Release v4.4.9 2025-12-23 12:02:30 -05:00
Jeremy Stretch
93119f52c3 Fixes #21032: Avoid subquery in RestrictedQuerySet where unnecessary 2025-12-23 10:15:06 -05:00
github-actions
ee2aa35cba Update source translation strings 2025-12-23 05:04:20 +00:00
bctiemann
7896a48075 Merge pull request #21029 from netbox-community/21011-configrevision-save
Fixes #21011: Avoid updating database when loading active ConfigRevision
2025-12-22 14:19:19 -05:00
bctiemann
eb87c3f304 Merge pull request #21000 from netbox-community/20011-misleading-error-message
Fixes #20011: Provide accurate error for bulk import duplicate IDs
2025-12-22 14:12:36 -05:00
Vincent Simonin
3acbb0a08c Fix on delete cascade entity order (#20949)
* Fix on delete cascade entity order

Since [#20708](https://github.com/netbox-community/netbox/pull/20708),
relations with `on_delete=RESTRICT` are not deleted in the proper order.
The error `violates not-null constraint` then occurs and breaks the
delete cascade feature.

* Revert unrelated and simplify changes
2025-12-22 13:19:02 -05:00
Jeremy Stretch
f67cc47def Fixes #21011: Avoid updating database when loading active ConfigRevision 2025-12-22 11:00:04 -05:00
Martin Hauser
f7219e0672 Closes #20309: Add ASDOT notation support for ASN ranges (#21004)
* feat(ipam): Add ASDOT notation support for ASN ranges

Introduces ASDOT notation for ASN Ranges to improve readability of large
AS numbers. Adds `start_asdot` and `end_asdot` properties, columns, and
display logic for ASN ranges in the UI.

Fixes #20309

* Wrap "ASDOT" with parentheses in column header

---------

Co-authored-by: Jeremy Stretch <jstretch@netboxlabs.com>
2025-12-22 10:06:08 -05:00
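For context, ASDOT notation (RFC 5396) writes a 4-byte ASN as `<high 16 bits>.<low 16 bits>`. A minimal conversion sketch follows; this is illustrative only, not NetBox's actual `start_asdot`/`end_asdot` property code:

```python
# Minimal asplain -> asdot conversion sketch (RFC 5396); illustrative only.
def asdot(asn: int) -> str:
    if asn < 65536:
        return str(asn)                     # 2-byte ASNs stay in plain notation
    return f"{asn >> 16}.{asn & 0xFFFF}"

assert asdot(64512) == "64512"
assert asdot(4200000000) == "64086.59904"
```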
Prince Kumar
e5a975176d Fixed #20944: Ensure cached scope fields stay consistent when Region, Site, or Location changes (#20986) 2025-12-22 09:48:43 -05:00
github-actions
83ee4fb593 Update source translation strings 2025-12-20 05:02:02 +00:00
bctiemann
db8271c904 Fixes #20114: Preserve parent bay during device bulk import when tags are present (#21019) 2025-12-19 17:05:32 -06:00
github-actions
5a24f99c9d Update source translation strings 2025-12-18 05:03:18 +00:00
Jeremy Stretch
9318c91405 Closes #20720: Add support for Latvian translations (#21003) 2025-12-17 15:20:04 -06:00
Martin Hauser
5c6aaf2388 Closes #20900: Allow multiple choices in CustomField select filter fields (#20992) 2025-12-17 14:32:46 -06:00
Jason Novinger
265f375595 Fixes #20876: Allow editing IPAddress in IPRange marked populated 2025-12-17 13:03:45 -05:00
Jason Novinger
d95fa8dbb2 Fixes #20011: UI Error msg for duplicate IDs in bulk import 2025-12-17 09:21:17 -06:00
bctiemann
2699149016 Merge pull request #20963 from pheus/20491-normalize-arrayfield-values-to-inclusive-pairs-for-api-tests
Fixes #20491: Normalize numeric range array fields for API test comparisons
2025-12-16 15:40:44 -05:00
vo42
f371004809 Fixes #20969: Fix FrontPortTemplateFilterSet rear_port_id queryset. (#20987) 2025-12-16 11:23:18 -08:00
github-actions
ad29402b87 Update source translation strings 2025-12-13 05:02:00 +00:00
Jason Novinger
598f8d034d Fixes #20912: Clear ModuleBay parent when module assignment removed (#20974) 2025-12-12 13:31:59 -08:00
Arthur Hanson
ec13a79907 Fixes #20875: Fix updating of denormalized fields for component models (#20956) 2025-12-12 13:29:34 -06:00
github-actions
21f4036782 Update source translation strings 2025-12-12 05:03:16 +00:00
bctiemann
ce3738572c Merge pull request #20967 from netbox-community/20966-remove-stick-scroll
Fixes #20966: Fix broken optgroup stickiness in ObjectType multiselect
2025-12-11 19:44:16 -05:00
bctiemann
cbb979934e Merge pull request #20958 from netbox-community/17976-manufacturer-devicetype_count
Fixes #17976: Remove devicetype_count from nested manufacturer to correct OpenAPI schema
2025-12-11 19:42:26 -05:00
bctiemann
642d83a4c6 Merge pull request #20937 from netbox-community/20560-bulk-import-prefix
Fixes #20560: Fix VLAN disambiguation in prefix bulk import
2025-12-11 19:40:59 -05:00
Jason Novinger
a06c12c6b8 Fixes #20966: Fix broken optgroup stickiness in ObjectType multiselect 2025-12-11 08:59:16 -06:00
Martin Hauser
60fce84c96 feat(ipam): Normalize numeric ranges in API output
Adds logic to handle numeric range fields in API responses by
converting them into inclusive `[low, high]` pairs for consistent
behavior. Updates test cases with `vid_ranges` fields to reflect the
changes.

Closes #20491
2025-12-10 21:11:23 +01:00
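Assuming the stored ranges follow PostgreSQL's half-open convention (exclusive upper bound), the normalization amounts to something like the sketch below; the helper name is hypothetical and the actual logic lives in the test utilities touched by this change:

```python
# Hedged sketch: convert a half-open numeric range to the inclusive [low, high]
# pair used for comparison, assuming an exclusive upper bound in storage.
def to_inclusive(lower: int, upper_exclusive: int) -> list[int]:
    return [lower, upper_exclusive - 1]

assert to_inclusive(100, 200) == [100, 199]
```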
Jeremy Stretch
59afa0b41d Fix test 2025-12-10 09:01:11 -05:00
Jeremy Stretch
14b246cb8a Fixes #17976: Remove devicetype_count from nested manufacturer to correct OpenAPI schema 2025-12-10 08:23:48 -05:00
github-actions
f0507d00bf Update source translation strings 2025-12-10 05:02:48 +00:00
Arthur Hanson
77b389f105 Fixes #20873: fix webhooks with image fields (#20955) 2025-12-09 22:06:11 -06:00
Jason Novinger
9ae53fc232 Fixes #20560: Fix VLAN disambiguation in prefix bulk import 2025-12-05 16:39:28 -06:00
122 changed files with 31122 additions and 7928 deletions

View File

@@ -15,7 +15,7 @@ body:
attributes:
label: NetBox version
description: What version of NetBox are you currently running?
placeholder: v4.4.8
placeholder: v4.4.10
validations:
required: true
- type: dropdown

View File

@@ -27,7 +27,7 @@ body:
attributes:
label: NetBox Version
description: What version of NetBox are you currently running?
placeholder: v4.4.8
placeholder: v4.4.10
validations:
required: true
- type: dropdown

View File

@@ -5,7 +5,7 @@
<a href="https://github.com/netbox-community/netbox/blob/main/LICENSE.txt"><img src="https://img.shields.io/badge/license-Apache_2.0-blue.svg" alt="License" /></a>
<a href="https://github.com/netbox-community/netbox/graphs/contributors"><img src="https://img.shields.io/github/contributors/netbox-community/netbox?color=blue" alt="Contributors" /></a>
<a href="https://github.com/netbox-community/netbox/stargazers"><img src="https://img.shields.io/github/stars/netbox-community/netbox?style=flat" alt="GitHub stars" /></a>
<a href="https://explore.transifex.com/netbox-community/netbox/"><img src="https://img.shields.io/badge/languages-15-blue" alt="Languages supported" /></a>
<a href="https://explore.transifex.com/netbox-community/netbox/"><img src="https://img.shields.io/badge/languages-16-blue" alt="Languages supported" /></a>
<a href="https://github.com/netbox-community/netbox/actions/workflows/ci.yml"><img src="https://github.com/netbox-community/netbox/actions/workflows/ci.yml/badge.svg" alt="CI status" /></a>
<p>
<strong><a href="https://netboxlabs.com/community/">NetBox Community</a></strong> |

View File

@@ -35,11 +35,6 @@ django-mptt==0.17.0
# https://github.com/Xof/django-pglocks/blob/master/CHANGES.txt
django-pglocks
# Manager for managing PostgreSQL triggers
# https://github.com/AmbitionEng/django-pgtrigger/blob/main/CHANGELOG.md
django-pgtrigger
# Prometheus metrics library for Django
# https://github.com/korfuri/django-prometheus/blob/master/CHANGELOG.md
django-prometheus

File diff suppressed because it is too large

View File

@@ -88,7 +88,7 @@ While permissions are typically assigned to specific groups and/or users, it is
### Viewing Objects
Object-based permissions work by filtering the database query generated by a user's request to restrict the set of objects returned. When a request is received, NetBox first determines whether the user is authenticated and has been granted to perform the requested action. For example, if the requested URL is `/dcim/devices/`, NetBox will check for the `dcim.view_device` permission. If the user has not been assigned this permission (either directly or via a group assignment), NetBox will return a 403 (forbidden) HTTP response.
Object-based permissions work by filtering the database query generated by a user's request to restrict the set of objects returned. When a request is received, NetBox first determines whether the user is authenticated and has been granted permission to perform the requested action. For example, if the requested URL is `/dcim/devices/`, NetBox will check for the `dcim.view_device` permission. If the user has not been assigned this permission (either directly or via a group assignment), NetBox will return a 403 (forbidden) HTTP response.
If the permission _has_ been granted, NetBox will compile any specified constraints for the model and action. For example, suppose two permissions have been assigned to the user granting view access to the device model, with the following constraints:
@@ -102,9 +102,9 @@ If the permission _has_ been granted, NetBox will compile any specified constrai
This grants the user access to view any device that is assigned to a site named NYC1 or NYC2, **or** which has a status of "offline" and has no tenant assigned. These constraints are equivalent to the following ORM query:
```no-highlight
Site.objects.filter(
Device.objects.filter(
Q(site__name__in=['NYC1', 'NYC2']),
Q(status='active', tenant__isnull=True)
Q(status='offline', tenant__isnull=True)
)
```

View File

@@ -16,9 +16,33 @@ Note that device bays and module bays may _not_ be added to modules.
## Automatic Component Renaming
When adding component templates to a module type, the string `{module}` can be used to reference the `position` field of the module bay into which an instance of the module type is being installed.
When adding component templates to a module type, placeholders can be used to dynamically incorporate the module bay's `position` field into component names. Two placeholders are available:
For example, you can create a module type with interface templates named `Gi{module}/0/[1-48]`. When a new module of this type is "installed" to a module bay with a position of "3", NetBox will automatically name these interfaces `Gi3/0/[1-48]`.
### `{module}` Placeholder
The `{module}` placeholder references the position of the parent module bay:
* **Single use**: Expands to the immediate parent's position only
* **Multiple uses**: Each `{module}` token is replaced level-by-level (the number of tokens must match the nesting depth)
For example, a module type with interface templates named `Gi{module}/0/[1-48]`, when installed in a module bay with position "3", will create interfaces named `Gi3/0/[1-48]`.
### `{module_path}` Placeholder
The `{module_path}` placeholder expands to the full path from the root device to the current module, with positions joined by `/`. This is useful for modules that can be installed at any nesting depth without modification.
For example, consider an SFP module type with an interface template named `eth{module_path}`:
* Installed directly in slot 2: creates interface `eth2`
* Installed in slot 1's nested bay 1: creates interface `eth1/1`
* Installed in slot 1's nested bay 2's sub-bay 3: creates interface `eth1/2/3`
!!! note
`{module_path}` can only be used once per template attribute, and cannot be mixed with `{module}` in the same attribute.
### Position Field Resolution
The `{module}` placeholder can also be used in the `position` field of [module bay templates](./modulebaytemplate.md) defined on a module type. This allows nested module bays to build hierarchical position values. For example, a module bay template with `position="{module}/1"`, when its parent module is installed in a bay with position "2", will have its position resolved to "2/1".
Automatic renaming is supported for all modular component types (those listed above).

View File

@@ -74,7 +74,7 @@ The plugin source directory contains all the actual Python code and other resour
The `PluginConfig` class is a NetBox-specific wrapper around Django's built-in [`AppConfig`](https://docs.djangoproject.com/en/stable/ref/applications/) class. It is used to declare NetBox plugin functionality within a Python package. Each plugin should provide its own subclass, defining its name, metadata, and default and required configuration parameters. An example is below:
```python
```python title="__init__.py"
from netbox.plugins import PluginConfig
class FooBarConfig(PluginConfig):
@@ -151,7 +151,7 @@ Any additional apps must be installed within the same Python environment as NetB
An example `pyproject.toml` is below:
```
```toml title="pyproject.toml"
# See PEP 518 for the spec of this file
# https://www.python.org/dev/peps/pep-0518/
@@ -179,11 +179,24 @@ classifiers=[
]
requires-python = ">=3.12.0"
```
Many of these are self-explanatory, but for more information, see the [pyproject.toml documentation](https://packaging.python.org/en/latest/specifications/pyproject-toml/).
## Compatibility Matrix
Consider adding a file named `COMPATIBILITY.md` to your plugin project root (alongside `pyproject.toml`). This file should contain a table listing the minimum and maximum supported versions of NetBox (`min_version` and `max_version`) for each release. This serves as a handy reference for users who are upgrading from a previous version of your plugin. An example is shown below:
```markdown title="COMPATIBILITY.md"
# Compatibility Matrix
| Release | Minimum NetBox Version | Maximum NetBox Version |
|---------|------------------------|------------------------|
| 0.2.0 | 4.4.0 | 4.5.x |
| 0.1.1 | 4.3.0 | 4.4.x |
| 0.1.0 | 4.3.0 | 4.4.x |
```
## Create a Virtual Environment
It is strongly recommended to create a Python [virtual environment](https://docs.python.org/3/tutorial/venv.html) for the development of your plugin, as opposed to using system-wide packages. This will afford you complete control over the installed versions of all dependencies and avoid conflict with system packages. This environment can live wherever you'd like; however, it should be excluded from revision control. (A popular convention is to keep all virtual environments in the user's home directory, e.g. `~/.virtualenvs/`.)

View File

@@ -1,5 +1,51 @@
# NetBox v4.4
## v4.4.10 (2026-01-06)
### Enhancements
* [#20953](https://github.com/netbox-community/netbox/issues/20953) - Show reverse bridge relationships on interface detail pages
* [#21071](https://github.com/netbox-community/netbox/issues/21071) - Include request method & URL when displaying server errors
### Bug Fixes
* [#19506](https://github.com/netbox-community/netbox/issues/19506) - Add filter forms for component templates to ensure object selector support
* [#20044](https://github.com/netbox-community/netbox/issues/20044) - Fix dark mode support for rack elevations
* [#20320](https://github.com/netbox-community/netbox/issues/20320) - Restore support for selecting related interfaces when bulk editing device interfaces
* [#20817](https://github.com/netbox-community/netbox/issues/20817) - Re-enable sync button when disabling scheduled syncing for a data source
* [#21045](https://github.com/netbox-community/netbox/issues/21045) - Fix `ValueError` exception when saving a site with an assigned prefix
* [#21049](https://github.com/netbox-community/netbox/issues/21049) - Ignore stale custom field data when validating an object
* [#21063](https://github.com/netbox-community/netbox/issues/21063) - Check for duplicate choice values when validating a custom field choice set
* [#21064](https://github.com/netbox-community/netbox/issues/21064) - Ensures that extra choices in custom field choice sets preserve escaped colons
---
## v4.4.9 (2025-12-23)
### Enhancements
* [#20309](https://github.com/netbox-community/netbox/issues/20309) - Support ASDOT notation for ASN ranges
* [#20720](https://github.com/netbox-community/netbox/issues/20720) - Add Latvian translations
* [#20900](https://github.com/netbox-community/netbox/issues/20900) - Allow filtering custom choice fields by multiple values in the UI
### Bug Fixes
* [#17976](https://github.com/netbox-community/netbox/issues/17976) - Remove `devicetype_count` from nested manufacturer to correct OpenAPI schema
* [#20011](https://github.com/netbox-community/netbox/issues/20011) - Provide a clear message when encountering duplicate object IDs during bulk import
* [#20114](https://github.com/netbox-community/netbox/issues/20114) - Preserve `parent_bay` during device bulk import when tags are present
* [#20491](https://github.com/netbox-community/netbox/issues/20491) - Improve handling of numeric ranges in tests
* [#20873](https://github.com/netbox-community/netbox/issues/20873) - Fix `AttributeError` exception triggered by event rules associated with an object that supports file attachments
* [#20875](https://github.com/netbox-community/netbox/issues/20875) - Ensure that parent object relations are cached (for filtering) on device/module components during instantiation
* [#20876](https://github.com/netbox-community/netbox/issues/20876) - Allow editing an IP address that resides within a range marked as populated
* [#20912](https://github.com/netbox-community/netbox/issues/20912) - Fix inconsistent clearing of `module` field on ModuleBay
* [#20944](https://github.com/netbox-community/netbox/issues/20944) - Ensure cached scope is updated on child objects when a parent region/site/location is changed
* [#20948](https://github.com/netbox-community/netbox/issues/20948) - Handle the deletion of related objects with `on_delete=RESTRICT` the same as `CASCADE`
* [#20969](https://github.com/netbox-community/netbox/issues/20969) - Fix querying of front port templates by `rear_port_id`
* [#21011](https://github.com/netbox-community/netbox/issues/21011) - Avoid writing to the database when loading active ConfigRevision
* [#21032](https://github.com/netbox-community/netbox/issues/21032) - Avoid SQL subquery in RestrictedQuerySet where unnecessary
---
## v4.4.8 (2025-12-09)
### Enhancements

View File

@@ -22,7 +22,7 @@
#### Lookup Modifiers in Filter Forms ([#7604](https://github.com/netbox-community/netbox/issues/7604))
Most object list filters within the UI have been extended to include optional lookup modifiers to support more complex queries. For instance, filters for numeric values now include a dropdown where a user can select "less than," "greater than," or "not" in addition to the default equivalency match. The specific modifiers available depend on the type of each filter.
Most object list filters within the UI have been extended to include optional lookup modifiers to support more complex queries. For instance, filters for numeric values now include a dropdown where a user can select "less than," "greater than," or "not" in addition to the default equivalency match. The specific modifiers available depend on the type of each filter. Plugins can register their own filtersets using the `register_filterset()` decorator to enable this new functionality.
(Note that this feature does not introduce any new filters. Rather, it makes available in the UI filters which already exist.)

View File

@@ -63,16 +63,20 @@ class ConfigRevision(models.Model):
return reverse('core:config') # Default config view
return reverse('core:configrevision', args=[self.pk])
def activate(self):
def activate(self, update_db=True):
"""
Cache the configuration data.
Parameters:
update_db: Mark the ConfigRevision as active in the database (default: True)
"""
cache.set('config', self.data, None)
cache.set('config_version', self.pk, None)
# Set all instances of ConfigRevision to false and set this instance to true
ConfigRevision.objects.all().update(active=False)
ConfigRevision.objects.filter(pk=self.pk).update(active=True)
if update_db:
# Set all instances of ConfigRevision to false and set this instance to true
ConfigRevision.objects.all().update(active=False)
ConfigRevision.objects.filter(pk=self.pk).update(active=True)
activate.alters_data = True
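Illustrative usage enabled by this change (assuming an active revision already exists): re-caching the configuration at startup without writing to the database.

```python
# Illustrative only: re-cache the active ConfigRevision without the UPDATE queries.
revision = ConfigRevision.objects.filter(active=True).first()
if revision:
    revision.activate(update_db=False)   # caches config data; skips active-flag writes
```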

View File

@@ -131,6 +131,19 @@ class DataSource(JobsMixin, PrimaryModel):
'source_url': "URLs for local sources must start with file:// (or specify no scheme)"
})
def save(self, *args, **kwargs):
# If recurring sync is disabled for an existing DataSource, clear any pending sync jobs for it and reset its
# "queued" status
if not self._state.adding and not self.sync_interval:
self.jobs.filter(status=JobStatusChoices.STATUS_PENDING).delete()
if self.status == DataSourceStatusChoices.QUEUED and self.last_synced:
self.status = DataSourceStatusChoices.COMPLETED
elif self.status == DataSourceStatusChoices.QUEUED:
self.status = DataSourceStatusChoices.NEW
super().save(*args, **kwargs)
def to_objectchange(self, action):
objectchange = super().to_objectchange(action)

View File

@@ -3,7 +3,7 @@ from threading import local
from django.contrib.contenttypes.models import ContentType
from django.core.exceptions import ObjectDoesNotExist, ValidationError
from django.db.models import CASCADE
from django.db.models import CASCADE, RESTRICT
from django.db.models.fields.reverse_related import ManyToManyRel, ManyToOneRel
from django.db.models.signals import m2m_changed, post_migrate, post_save, pre_delete
from django.dispatch import receiver, Signal
@@ -221,7 +221,7 @@ def handle_deleted_object(sender, instance, **kwargs):
obj.snapshot() # Ensure the change record includes the "before" state
if type(relation) is ManyToManyRel:
getattr(obj, related_field_name).remove(instance)
elif type(relation) is ManyToOneRel and relation.null and relation.on_delete is not CASCADE:
elif type(relation) is ManyToOneRel and relation.null and relation.on_delete not in (CASCADE, RESTRICT):
setattr(obj, related_field_name, None)
obj.save()

View File

@@ -1,5 +1,5 @@
from django.utils.translation import gettext as _
from django.contrib.contenttypes.models import ContentType
from django.utils.translation import gettext as _
from rest_framework import serializers
from dcim.choices import *
@@ -208,6 +208,7 @@ class InterfaceSerializer(
type = ChoiceField(choices=InterfaceTypeChoices)
parent = NestedInterfaceSerializer(required=False, allow_null=True)
bridge = NestedInterfaceSerializer(required=False, allow_null=True)
bridge_interfaces = NestedInterfaceSerializer(many=True, read_only=True)
lag = NestedInterfaceSerializer(required=False, allow_null=True)
mode = ChoiceField(choices=InterfaceModeChoices, required=False, allow_blank=True)
duplex = ChoiceField(choices=InterfaceDuplexChoices, required=False, allow_blank=True, allow_null=True)
@@ -247,13 +248,13 @@ class InterfaceSerializer(
model = Interface
fields = [
'id', 'url', 'display_url', 'display', 'device', 'vdcs', 'module', 'name', 'label', 'type', 'enabled',
'parent', 'bridge', 'lag', 'mtu', 'mac_address', 'primary_mac_address', 'mac_addresses', 'speed', 'duplex',
'wwn', 'mgmt_only', 'description', 'mode', 'rf_role', 'rf_channel', 'poe_mode', 'poe_type',
'rf_channel_frequency', 'rf_channel_width', 'tx_power', 'untagged_vlan', 'tagged_vlans', 'qinq_svlan',
'vlan_translation_policy', 'mark_connected', 'cable', 'cable_end', 'wireless_link', 'link_peers',
'link_peers_type', 'wireless_lans', 'vrf', 'l2vpn_termination', 'connected_endpoints',
'connected_endpoints_type', 'connected_endpoints_reachable', 'owner', 'tags', 'custom_fields', 'created',
'last_updated', 'count_ipaddresses', 'count_fhrp_groups', '_occupied',
'parent', 'bridge', 'bridge_interfaces', 'lag', 'mtu', 'mac_address', 'primary_mac_address',
'mac_addresses', 'speed', 'duplex', 'wwn', 'mgmt_only', 'description', 'mode', 'rf_role', 'rf_channel',
'poe_mode', 'poe_type', 'rf_channel_frequency', 'rf_channel_width', 'tx_power', 'untagged_vlan',
'tagged_vlans', 'qinq_svlan', 'vlan_translation_policy', 'mark_connected', 'cable', 'cable_end',
'wireless_link', 'link_peers', 'link_peers_type', 'wireless_lans', 'vrf', 'l2vpn_termination',
'connected_endpoints', 'connected_endpoints_type', 'connected_endpoints_reachable', 'owner', 'tags',
'custom_fields', 'created', 'last_updated', 'count_ipaddresses', 'count_fhrp_groups', '_occupied',
]
brief_fields = ('id', 'url', 'display', 'device', 'name', 'description', 'cable', '_occupied')

View File

@@ -22,4 +22,4 @@ class ManufacturerSerializer(OrganizationalModelSerializer):
'custom_fields', 'created', 'last_updated', 'devicetype_count', 'moduletype_count', 'inventoryitem_count',
'platform_count',
]
brief_fields = ('id', 'url', 'display', 'name', 'slug', 'description', 'devicetype_count')
brief_fields = ('id', 'url', 'display', 'name', 'slug', 'description')

View File

@@ -79,6 +79,8 @@ NONCONNECTABLE_IFACE_TYPES = VIRTUAL_IFACE_TYPES + WIRELESS_IFACE_TYPES
#
MODULE_TOKEN = '{module}'
MODULE_PATH_TOKEN = '{module_path}'
MODULE_TOKEN_SEPARATOR = '/'
MODULAR_COMPONENT_TEMPLATE_MODELS = Q(
app_label='dcim',

View File

@@ -905,7 +905,7 @@ class FrontPortTemplateFilterSet(ChangeLoggedModelFilterSet, ModularDeviceTypeCo
)
rear_port_id = django_filters.ModelMultipleChoiceFilter(
field_name='mappings__rear_port',
queryset=RearPort.objects.all(),
queryset=RearPortTemplate.objects.all(),
to_field_name='rear_port',
label=_('Rear port (ID)'),
)

View File

@@ -3,6 +3,7 @@ from django.utils.translation import gettext_lazy as _
from dcim.choices import *
from dcim.constants import *
from dcim.utils import resolve_module_placeholders
from utilities.forms import get_field_value
__all__ = (
@@ -119,25 +120,47 @@ class ModuleCommonForm(forms.Form):
# Get the templates for the module type.
for template in getattr(module_type, templates).all():
resolved_name = template.name
has_module_token = MODULE_TOKEN in template.name
has_module_path_token = MODULE_PATH_TOKEN in template.name
# Installing modules with placeholders require that the bay has a position value
if MODULE_TOKEN in template.name:
if has_module_token or has_module_path_token:
if not module_bay.position:
raise forms.ValidationError(
_("Cannot install module with placeholder values in a module bay with no position defined.")
)
if len(module_bays) != template.name.count(MODULE_TOKEN):
# Cannot mix {module} and {module_path} in the same attribute
if has_module_token and has_module_path_token:
raise forms.ValidationError(
_(
"Cannot install module with placeholder values in a module bay tree {level} in tree "
"but {tokens} placeholders given."
).format(
level=len(module_bays), tokens=template.name.count(MODULE_TOKEN)
)
_("Cannot mix {module} and {module_path} placeholders in the same template attribute.")
)
for module_bay in module_bays:
resolved_name = resolved_name.replace(MODULE_TOKEN, module_bay.position, 1)
# Validate {module_path} - can only appear once
if has_module_path_token:
path_token_count = template.name.count(MODULE_PATH_TOKEN)
if path_token_count > 1:
raise forms.ValidationError(
_("The {module_path} placeholder can only be used once per template.")
)
# Validate {module} - multi-token must match depth exactly
if has_module_token:
token_count = template.name.count(MODULE_TOKEN)
# Multiple {module} tokens must match the tree depth exactly
if token_count > 1 and token_count != len(module_bays):
raise forms.ValidationError(
_(
"Cannot install module with placeholder values in a module bay tree {level} deep "
"but {tokens} placeholders given."
).format(
level=len(module_bays), tokens=token_count
)
)
# Use centralized helper for placeholder substitution
positions = [mb.position for mb in module_bays]
resolved_name = resolve_module_placeholders(resolved_name, positions)
existing_item = installed_components.get(resolved_name)

View File

@@ -27,35 +27,45 @@ __all__ = (
'CableFilterForm',
'ConsoleConnectionFilterForm',
'ConsolePortFilterForm',
'ConsolePortTemplateFilterForm',
'ConsoleServerPortFilterForm',
'ConsoleServerPortTemplateFilterForm',
'DeviceBayFilterForm',
'DeviceBayTemplateFilterForm',
'DeviceFilterForm',
'DeviceRoleFilterForm',
'DeviceTypeFilterForm',
'FrontPortFilterForm',
'FrontPortTemplateFilterForm',
'InterfaceConnectionFilterForm',
'InterfaceFilterForm',
'InterfaceTemplateFilterForm',
'InventoryItemFilterForm',
'InventoryItemTemplateFilterForm',
'InventoryItemRoleFilterForm',
'LocationFilterForm',
'MACAddressFilterForm',
'ManufacturerFilterForm',
'ModuleFilterForm',
'ModuleBayFilterForm',
'ModuleBayTemplateFilterForm',
'ModuleTypeFilterForm',
'ModuleTypeProfileFilterForm',
'PlatformFilterForm',
'PowerConnectionFilterForm',
'PowerFeedFilterForm',
'PowerOutletFilterForm',
'PowerOutletTemplateFilterForm',
'PowerPanelFilterForm',
'PowerPortFilterForm',
'PowerPortTemplateFilterForm',
'RackFilterForm',
'RackElevationFilterForm',
'RackReservationFilterForm',
'RackRoleFilterForm',
'RackTypeFilterForm',
'RearPortFilterForm',
'RearPortTemplateFilterForm',
'RegionFilterForm',
'SiteFilterForm',
'SiteGroupFilterForm',
@@ -1333,6 +1343,23 @@ class PowerFeedFilterForm(TenancyFilterForm, PrimaryModelFilterSetForm):
# Device components
#
class DeviceComponentTemplateFilterForm(NetBoxModelFilterSetForm):
device_type_id = DynamicModelMultipleChoiceField(
queryset=DeviceType.objects.all(),
required=False,
label=_('Device type'),
)
class ModularDeviceComponentTemplateFilterForm(DeviceComponentTemplateFilterForm):
module_type_id = DynamicModelMultipleChoiceField(
queryset=ModuleType.objects.all(),
required=False,
query_params={'manufacturer_id': '$manufacturer_id'},
label=_('Module Type'),
)
class CabledFilterForm(forms.Form):
cabled = forms.NullBooleanField(
label=_('Cabled'),
@@ -1385,6 +1412,20 @@ class ConsolePortFilterForm(PathEndpointFilterForm, DeviceComponentFilterForm):
tag = TagFilterField(model)
class ConsolePortTemplateFilterForm(ModularDeviceComponentTemplateFilterForm):
model = ConsolePortTemplate
fieldsets = (
FieldSet('q', 'filter_id', 'tag'),
FieldSet('name', 'label', 'type', name=_('Attributes')),
FieldSet('device_type_id', 'module_type_id', name=_('Device')),
)
type = forms.MultipleChoiceField(
label=_('Type'),
choices=ConsolePortTypeChoices,
required=False
)
class ConsoleServerPortFilterForm(PathEndpointFilterForm, DeviceComponentFilterForm):
model = ConsoleServerPort
fieldsets = (
@@ -1410,6 +1451,20 @@ class ConsoleServerPortFilterForm(PathEndpointFilterForm, DeviceComponentFilterF
tag = TagFilterField(model)
class ConsoleServerPortTemplateFilterForm(ModularDeviceComponentTemplateFilterForm):
model = ConsoleServerPortTemplate
fieldsets = (
FieldSet('q', 'filter_id', 'tag'),
FieldSet('name', 'label', 'type', name=_('Attributes')),
FieldSet('device_type_id', 'module_type_id', name=_('Device')),
)
type = forms.MultipleChoiceField(
label=_('Type'),
choices=ConsolePortTypeChoices,
required=False
)
class PowerPortFilterForm(PathEndpointFilterForm, DeviceComponentFilterForm):
model = PowerPort
fieldsets = (
@@ -1430,6 +1485,20 @@ class PowerPortFilterForm(PathEndpointFilterForm, DeviceComponentFilterForm):
tag = TagFilterField(model)
class PowerPortTemplateFilterForm(ModularDeviceComponentTemplateFilterForm):
model = PowerPortTemplate
fieldsets = (
FieldSet('q', 'filter_id', 'tag'),
FieldSet('name', 'label', 'type', name=_('Attributes')),
FieldSet('device_type_id', 'module_type_id', name=_('Device')),
)
type = forms.MultipleChoiceField(
label=_('Type'),
choices=PowerPortTypeChoices,
required=False
)
class PowerOutletFilterForm(PathEndpointFilterForm, DeviceComponentFilterForm):
model = PowerOutlet
fieldsets = (
@@ -1459,6 +1528,20 @@ class PowerOutletFilterForm(PathEndpointFilterForm, DeviceComponentFilterForm):
)
class PowerOutletTemplateFilterForm(ModularDeviceComponentTemplateFilterForm):
model = PowerOutletTemplate
fieldsets = (
FieldSet('q', 'filter_id', 'tag'),
FieldSet('name', 'label', 'type', name=_('Attributes')),
FieldSet('device_type_id', 'module_type_id', name=_('Device')),
)
type = forms.MultipleChoiceField(
label=_('Type'),
choices=PowerOutletTypeChoices,
required=False
)
class InterfaceFilterForm(PathEndpointFilterForm, DeviceComponentFilterForm):
model = Interface
fieldsets = (
@@ -1586,6 +1669,51 @@ class InterfaceFilterForm(PathEndpointFilterForm, DeviceComponentFilterForm):
tag = TagFilterField(model)
class InterfaceTemplateFilterForm(ModularDeviceComponentTemplateFilterForm):
model = InterfaceTemplate
fieldsets = (
FieldSet('q', 'filter_id', 'tag'),
FieldSet('name', 'label', 'type', 'enabled', 'mgmt_only', name=_('Attributes')),
FieldSet('poe_mode', 'poe_type', name=_('PoE')),
FieldSet('rf_role', name=_('Wireless')),
FieldSet('device_type_id', 'module_type_id', name=_('Device')),
)
type = forms.MultipleChoiceField(
label=_('Type'),
choices=InterfaceTypeChoices,
required=False
)
enabled = forms.NullBooleanField(
label=_('Enabled'),
required=False,
widget=forms.Select(
choices=BOOLEAN_WITH_BLANK_CHOICES
)
)
mgmt_only = forms.NullBooleanField(
label=_('Management only'),
required=False,
widget=forms.Select(
choices=BOOLEAN_WITH_BLANK_CHOICES
)
)
poe_mode = forms.MultipleChoiceField(
choices=InterfacePoEModeChoices,
required=False,
label=_('PoE mode')
)
poe_type = forms.MultipleChoiceField(
choices=InterfacePoETypeChoices,
required=False,
label=_('PoE type')
)
rf_role = forms.MultipleChoiceField(
choices=WirelessRoleChoices,
required=False,
label=_('Wireless role')
)
class FrontPortFilterForm(CabledFilterForm, DeviceComponentFilterForm):
fieldsets = (
FieldSet('q', 'filter_id', 'tag', 'owner_id'),
@@ -1610,6 +1738,24 @@ class FrontPortFilterForm(CabledFilterForm, DeviceComponentFilterForm):
tag = TagFilterField(model)
class FrontPortTemplateFilterForm(ModularDeviceComponentTemplateFilterForm):
model = FrontPortTemplate
fieldsets = (
FieldSet('q', 'filter_id', 'tag'),
FieldSet('name', 'label', 'type', 'color', name=_('Attributes')),
FieldSet('device_type_id', 'module_type_id', name=_('Device')),
)
type = forms.MultipleChoiceField(
label=_('Type'),
choices=PortTypeChoices,
required=False
)
color = ColorField(
label=_('Color'),
required=False
)
class RearPortFilterForm(CabledFilterForm, DeviceComponentFilterForm):
model = RearPort
fieldsets = (
@@ -1634,6 +1780,24 @@ class RearPortFilterForm(CabledFilterForm, DeviceComponentFilterForm):
tag = TagFilterField(model)
class RearPortTemplateFilterForm(ModularDeviceComponentTemplateFilterForm):
model = RearPortTemplate
fieldsets = (
FieldSet('q', 'filter_id', 'tag'),
FieldSet('name', 'label', 'type', 'color', name=_('Attributes')),
FieldSet('device_type_id', 'module_type_id', name=_('Device')),
)
type = forms.MultipleChoiceField(
label=_('Type'),
choices=PortTypeChoices,
required=False
)
color = ColorField(
label=_('Color'),
required=False
)
class ModuleBayFilterForm(DeviceComponentFilterForm):
model = ModuleBay
fieldsets = (
@@ -1652,6 +1816,19 @@ class ModuleBayFilterForm(DeviceComponentFilterForm):
)
class ModuleBayTemplateFilterForm(ModularDeviceComponentTemplateFilterForm):
model = ModuleBayTemplate
fieldsets = (
FieldSet('q', 'filter_id', 'tag'),
FieldSet('name', 'label', 'position', name=_('Attributes')),
FieldSet('device_type_id', 'module_type_id', name=_('Device')),
)
position = forms.CharField(
label=_('Position'),
required=False,
)
class DeviceBayFilterForm(DeviceComponentFilterForm):
model = DeviceBay
fieldsets = (
@@ -1666,6 +1843,15 @@ class DeviceBayFilterForm(DeviceComponentFilterForm):
tag = TagFilterField(model)
class DeviceBayTemplateFilterForm(DeviceComponentTemplateFilterForm):
model = DeviceBayTemplate
fieldsets = (
FieldSet('q', 'filter_id', 'tag'),
FieldSet('name', 'label', name=_('Attributes')),
FieldSet('device_type_id', name=_('Device')),
)
class InventoryItemFilterForm(DeviceComponentFilterForm):
model = InventoryItem
fieldsets = (
@@ -1713,6 +1899,25 @@ class InventoryItemFilterForm(DeviceComponentFilterForm):
tag = TagFilterField(model)
class InventoryItemTemplateFilterForm(DeviceComponentTemplateFilterForm):
model = InventoryItemTemplate
fieldsets = (
FieldSet('q', 'filter_id', 'tag'),
FieldSet('name', 'label', 'role_id', 'manufacturer_id', name=_('Attributes')),
FieldSet('device_type_id', name=_('Device')),
)
role_id = DynamicModelMultipleChoiceField(
queryset=InventoryItemRole.objects.all(),
required=False,
label=_('Role')
)
manufacturer_id = DynamicModelMultipleChoiceField(
queryset=Manufacturer.objects.all(),
required=False,
label=_('Manufacturer')
)
#
# Device component roles
#

View File

@@ -115,8 +115,9 @@ class Cable(PrimaryModel):
# A copy of the PK to be used by __str__ in case the object is deleted
self._pk = self.__dict__.get('id')
# Cache the original status so we can check later if it's been changed
# Cache the original profile & status so we can check later whether either has been changed
self._orig_status = self.__dict__.get('status')
self._orig_profile = self.__dict__.get('profile')
self._terminations_modified = False
@@ -290,7 +291,10 @@ class Cable(PrimaryModel):
# Update the private PK used in __str__()
self._pk = self.pk
if self._terminations_modified:
if self._orig_profile != self.profile:
print(f'profile changed from {self._orig_profile} to {self.profile}')
self.update_terminations(force=True)
elif self._terminations_modified:
self.update_terminations()
super().save(*args, force_update=True, using=using, update_fields=update_fields)
@@ -344,24 +348,28 @@ class Cable(PrimaryModel):
return a_terminations, b_terminations
def update_terminations(self):
def update_terminations(self, force=False):
"""
Create/delete CableTerminations for this Cable to reflect its current state.
Args:
force: Force the recreation of all CableTerminations, even if no changes have been made. Needed e.g. when
altering a Cable's assigned profile.
"""
a_terminations, b_terminations = self.get_terminations()
# Delete any stale CableTerminations
for termination, ct in a_terminations.items():
if termination.pk and termination not in self.a_terminations:
if force or (termination.pk and termination not in self.a_terminations):
ct.delete()
for termination, ct in b_terminations.items():
if termination.pk and termination not in self.b_terminations:
if force or (termination.pk and termination not in self.b_terminations):
ct.delete()
# Save any new CableTerminations
profile = self.profile_class() if self.profile else None
for i, termination in enumerate(self.a_terminations, start=1):
if not termination.pk or termination not in a_terminations:
if force or not termination.pk or termination not in a_terminations:
connector = positions = None
if profile:
connector = i
@@ -374,7 +382,7 @@ class Cable(PrimaryModel):
termination=termination
).save()
for i, termination in enumerate(self.b_terminations, start=1):
if not termination.pk or termination not in b_terminations:
if force or not termination.pk or termination not in b_terminations:
connector = positions = None
if profile:
connector = i
@@ -790,7 +798,8 @@ class CablePath(models.Model):
# Profile-based tracing
if links[0].profile:
cable_profile = links[0].profile_class()
term, position = cable_profile.get_peer_termination(terminations[0], position_stack.pop()[0])
position = position_stack.pop()[0] if position_stack else None
term, position = cable_profile.get_peer_termination(terminations[0], position)
remote_terminations = [term]
position_stack.append([position])

View File

@@ -8,6 +8,7 @@ from mptt.models import MPTTModel, TreeForeignKey
from dcim.choices import *
from dcim.constants import *
from dcim.models.base import PortMappingBase
from dcim.utils import resolve_module_placeholders
from dcim.models.mixins import InterfaceValidationMixin
from netbox.models import ChangeLoggedModel
from utilities.fields import ColorField, NaturalOrderingField
@@ -170,27 +171,17 @@ class ModularComponentTemplateModel(ComponentTemplateModel):
return modules
def resolve_name(self, module):
if MODULE_TOKEN not in self.name:
return self.name
"""Resolve {module} and {module_path} placeholders in component name."""
if module:
modules = self._get_module_tree(module)
name = self.name
for module in modules:
name = name.replace(MODULE_TOKEN, module.module_bay.position, 1)
return name
positions = [m.module_bay.position for m in self._get_module_tree(module)]
return resolve_module_placeholders(self.name, positions)
return self.name
def resolve_label(self, module):
if MODULE_TOKEN not in self.label:
return self.label
"""Resolve {module} and {module_path} placeholders in component label."""
if module:
modules = self._get_module_tree(module)
label = self.label
for module in modules:
label = label.replace(MODULE_TOKEN, module.module_bay.position, 1)
return label
positions = [m.module_bay.position for m in self._get_module_tree(module)]
return resolve_module_placeholders(self.label, positions)
return self.label
@@ -721,11 +712,26 @@ class ModuleBayTemplate(ModularComponentTemplateModel):
verbose_name = _('module bay template')
verbose_name_plural = _('module bay templates')
def resolve_position(self, module):
"""
Resolve {module} and {module_path} placeholders in position field.
This allows positions like "{module}/1" to resolve to "A/1" when
the parent module is installed in bay "A".
Fixes Issue #20467.
"""
if module:
positions = [m.module_bay.position for m in self._get_module_tree(module)]
return resolve_module_placeholders(self.position, positions)
return self.position
def instantiate(self, **kwargs):
module = kwargs.get('module')
return self.component_model(
name=self.resolve_name(kwargs.get('module')),
label=self.resolve_label(kwargs.get('module')),
position=self.position,
name=self.resolve_name(module),
label=self.resolve_label(module),
position=self.resolve_position(module),
**kwargs
)
instantiate.do_not_call_in_templates = True

View File

@@ -1292,6 +1292,8 @@ class ModuleBay(ModularComponentModel, TrackingModelMixin, MPTTModel):
def save(self, *args, **kwargs):
if self.module:
self.parent = self.module.module_bay
else:
self.parent = None
super().save(*args, **kwargs)

View File

@@ -959,6 +959,11 @@ class Device(
if cf_defaults := CustomField.objects.get_defaults_for_model(model):
for component in components:
component.custom_field_data = cf_defaults
# Set denormalized references
for component in components:
component._site = self.site
component._location = self.location
component._rack = self.rack
components = model.objects.bulk_create(components)
# Prefetch related objects to minimize queries needed during post_save
prefetch_fields = get_prefetchable_fields(model)

View File

@@ -321,6 +321,12 @@ class Module(TrackingModelMixin, PrimaryModel, ConfigContextModel):
for component in create_instances:
component.custom_field_data = cf_defaults
# Set denormalized references
for component in create_instances:
component._site = self.device.site
component._location = self.device.location
component._rack = self.device.rack
if component_model is not ModuleBay:
component_model.objects.bulk_create(create_instances)
# Emit the post_save signal for each newly created object

View File

@@ -1,15 +1,17 @@
import logging
from django.db.models import Q
from django.db.models.signals import post_save, post_delete
from django.db.models.signals import post_delete, post_save
from django.dispatch import receiver
from dcim.choices import CableEndChoices, LinkStatusChoices
from virtualization.models import VMInterface
from ipam.models import Prefix
from virtualization.models import Cluster, VMInterface
from wireless.models import WirelessLAN
from .models import (
Cable, CablePath, CableTermination, ConsolePort, ConsoleServerPort, Device, DeviceBay, FrontPort, Interface,
InventoryItem, ModuleBay, PathEndpoint, PortMapping, PowerOutlet, PowerPanel, PowerPort, Rack, RearPort, Location,
VirtualChassis,
InventoryItem, Location, ModuleBay, PathEndpoint, PortMapping, PowerOutlet, PowerPanel, PowerPort, Rack, RearPort,
Site, VirtualChassis,
)
from .models.cables import trace_paths
from .utils import create_cablepaths, rebuild_paths
@@ -45,6 +47,9 @@ def handle_location_site_change(instance, created, **kwargs):
Device.objects.filter(location__in=locations).update(site=instance.site)
PowerPanel.objects.filter(location__in=locations).update(site=instance.site)
CableTermination.objects.filter(_location__in=locations).update(_site=instance.site)
# Update component models for devices in these locations
for model in COMPONENT_MODELS:
model.objects.filter(device__location__in=locations).update(_site=instance.site)
@receiver(post_save, sender=Rack)
@@ -54,6 +59,12 @@ def handle_rack_site_change(instance, created, **kwargs):
"""
if not created:
Device.objects.filter(rack=instance).update(site=instance.site, location=instance.location)
# Update component models for devices in this rack
for model in COMPONENT_MODELS:
model.objects.filter(device__rack=instance).update(
_site=instance.site,
_location=instance.location,
)
@receiver(post_save, sender=Device)
@@ -172,3 +183,40 @@ def update_mac_address_interface(instance, created, raw, **kwargs):
if created and not raw and instance.primary_mac_address:
instance.primary_mac_address.assigned_object = instance
instance.primary_mac_address.save()
@receiver(post_save, sender=Location)
@receiver(post_save, sender=Site)
def sync_cached_scope_fields(instance, created, **kwargs):
"""
Rebuild cached scope fields for all CachedScopeMixin-based models
affected by a change in a Region, SiteGroup, Site, or Location.
This method is safe to run for objects created in the past and does
not rely on incremental updates. Cached fields are recomputed from
authoritative relationships.
"""
if created:
return
if isinstance(instance, Location):
filters = {'_location': instance}
elif isinstance(instance, Site):
filters = {'_site': instance}
else:
return
# These models are explicitly listed because they all subclass CachedScopeMixin
# and therefore require their cached scope fields to be recomputed.
for model in (Prefix, Cluster, WirelessLAN):
qs = model.objects.filter(**filters)
for obj in qs:
# Recompute cache using the same logic as save()
obj.cache_related_objects()
obj.save(update_fields=[
'_location',
'_site',
'_site_group',
'_region',
])

View File

@@ -532,7 +532,7 @@ class RackReservationTest(APIViewTestCases.APIViewTestCase):
class ManufacturerTest(APIViewTestCases.APIViewTestCase):
model = Manufacturer
brief_fields = ['description', 'devicetype_count', 'display', 'id', 'name', 'slug', 'url']
brief_fields = ['description', 'display', 'id', 'name', 'slug', 'url']
create_data = [
{
'name': 'Manufacturer 4',

File diff suppressed because it is too large

View File

@@ -2362,6 +2362,32 @@ class DeviceTestCase(ViewTestCases.PrimaryObjectViewTestCase):
self.remove_permissions('dcim.view_device')
self.assertHttpStatus(self.client.get(url), 403)
@override_settings(EXEMPT_VIEW_PERMISSIONS=['*'])
def test_bulk_import_duplicate_ids_error_message(self):
device = Device.objects.first()
csv_data = (
"id,role",
f"{device.pk},Device Role 1",
f"{device.pk},Device Role 2",
)
self.add_permissions('dcim.add_device', 'dcim.change_device')
response = self.client.post(
self._get_url('bulk_import'),
{
'data': '\n'.join(csv_data),
'format': ImportFormatChoices.CSV,
'csv_delimiter': CSVDelimiterChoices.AUTO,
},
follow=True
)
self.assertEqual(response.status_code, 200)
self.assertIn(
f'Duplicate objects found: Device with ID(s) {device.pk} appears multiple times',
response.content.decode('utf-8')
)
class ModuleTestCase(
# Module does not support bulk renaming (no name field) or

View File

@@ -4,6 +4,54 @@ from django.apps import apps
from django.contrib.contenttypes.models import ContentType
from django.db import router, transaction
from dcim.constants import MODULE_PATH_TOKEN, MODULE_TOKEN, MODULE_TOKEN_SEPARATOR
def resolve_module_placeholders(text, positions):
"""
Substitute {module} and {module_path} placeholders in text with position values.
Args:
text: String potentially containing {module} or {module_path} placeholders
positions: List of position strings from the module tree (root to leaf)
Returns:
Text with placeholders replaced according to these rules:
{module_path}: Always expands to full path (positions joined by MODULE_TOKEN_SEPARATOR).
Can only appear once in the text.
{module}: If used once, expands to the PARENT module bay position only (last in positions).
If used multiple times, each token is replaced level-by-level.
This design (Option 2 per sigprof's feedback) allows two approaches:
1. Use {module_path} for automatic full-path expansion (hardcodes '/' separator)
2. Use {module} in position fields to build custom paths with user-controlled separators
"""
if not text:
return text
result = text
# Handle {module_path} - always expands to full path
if MODULE_PATH_TOKEN in result:
full_path = MODULE_TOKEN_SEPARATOR.join(positions) if positions else ''
result = result.replace(MODULE_PATH_TOKEN, full_path)
# Handle {module} - parent-only for single token, level-by-level for multiple
if MODULE_TOKEN in result:
token_count = result.count(MODULE_TOKEN)
if token_count == 1 and positions:
# Single {module}: substitute with parent (immediate) bay position only
parent_position = positions[-1] if positions else ''
result = result.replace(MODULE_TOKEN, parent_position, 1)
else:
# Multiple {module}: substitute level-by-level (existing behavior)
for pos in positions:
result = result.replace(MODULE_TOKEN, pos, 1)
return result
def compile_path_node(ct_id, object_id):
return f'{ct_id}:{object_id}'

View File

@@ -16,7 +16,7 @@ from circuits.models import Circuit, CircuitTermination
from dcim.ui import panels
from extras.ui.panels import CustomFieldsPanel, ImageAttachmentsPanel, TagsPanel
from extras.views import ObjectConfigContextView, ObjectRenderConfigView
from ipam.models import ASN, IPAddress, Prefix, VLANGroup, VLAN
from ipam.models import ASN, IPAddress, Prefix, VLAN, VLANGroup
from ipam.tables import InterfaceVLANTable, VLANTranslationRuleTable
from netbox.object_actions import *
from netbox.ui import actions, layout
@@ -2714,11 +2714,12 @@ class DeviceBulkImportView(generic.BulkImportView):
model_form = forms.DeviceImportForm
def save_object(self, object_form, request):
parent_bay = getattr(object_form.instance, 'parent_bay', None)
obj = object_form.save()
# For child devices, save the reverse relation to the parent device bay
if getattr(obj, 'parent_bay', None):
device_bay = obj.parent_bay
if parent_bay:
device_bay = parent_bay
device_bay.installed_device = obj
device_bay.save()
@@ -3159,6 +3160,7 @@ class InterfaceView(generic.ObjectView):
return {
'vdc_table': vdc_table,
'bridge_interfaces': bridge_interfaces,
'bridge_interfaces_table': bridge_interfaces_table,
'child_interfaces_table': child_interfaces_table,
'vlan_table': vlan_table,

View File

@@ -119,7 +119,9 @@ def process_event_rules(event_rules, object_type, event_type, data, username=Non
if snapshots:
params["snapshots"] = snapshots
if request:
params["request"] = copy_safe_request(request)
# Exclude FILES - webhooks don't need uploaded files,
# which can cause pickle errors with Pillow.
params["request"] = copy_safe_request(request, include_files=False)
# Enqueue the task
rq_queue.enqueue(

View File

@@ -189,22 +189,22 @@ class CustomFieldChoiceSetForm(ChangelogMessageMixin, OwnerMixin, forms.ModelFor
# if we standardize these, we can simplify this code
# Convert extra_choices Array Field from model to CharField for form
if 'extra_choices' in self.initial and self.initial['extra_choices']:
extra_choices = self.initial['extra_choices']
if extra_choices := self.initial.get('extra_choices', None):
if isinstance(extra_choices, str):
extra_choices = [extra_choices]
choices = ""
choices = []
for choice in extra_choices:
# Setup choices in Add Another use case
if isinstance(choice, str):
choice_str = ":".join(choice.replace("'", "").replace(" ", "")[1:-1].split(","))
choices += choice_str + "\n"
choices.append(choice_str)
# Setup choices in Edit use case
elif isinstance(choice, list):
choice_str = ":".join(choice)
choices += choice_str + "\n"
value = choice[0].replace(':', '\\:')
label = choice[1].replace(':', '\\:')
choices.append(f'{value}:{label}')
self.initial['extra_choices'] = choices
self.initial['extra_choices'] = '\n'.join(choices)
def clean_extra_choices(self):
data = []
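Illustration (not part of the change): the colon-escaping performed in __init__ above can be sketched standalone; the helper name and sample data are illustrative, not NetBox API.
def choices_to_text(extra_choices):
    # Escape literal colons in values/labels, then join as 'value:label' lines
    lines = []
    for value, label in extra_choices:
        value = value.replace(':', '\\:')
        label = label.replace(':', '\\:')
        lines.append(f'{value}:{label}')
    return '\n'.join(lines)

choices_to_text([['foo:bar', 'label'], ['value', 'label:with:colons']])
# 'foo\\:bar:label\nvalue:label\\:with\\:colons'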

View File

@@ -450,7 +450,14 @@ class CustomField(CloningMixin, ExportTemplatesMixin, OwnerMixin, ChangeLoggedMo
return model.objects.filter(pk__in=value)
return value
def to_form_field(self, set_initial=True, enforce_required=True, enforce_visibility=True, for_csv_import=False):
def to_form_field(
self,
set_initial=True,
enforce_required=True,
enforce_visibility=True,
for_csv_import=False,
for_filterset_form=False,
):
"""
Return a form field suitable for setting a CustomField's value for an object.
@@ -458,6 +465,7 @@ class CustomField(CloningMixin, ExportTemplatesMixin, OwnerMixin, ChangeLoggedMo
enforce_required: Honor the value of CustomField.required. Set to False for filtering/bulk editing.
enforce_visibility: Honor the value of CustomField.ui_visible. Set to False for filtering.
for_csv_import: Return a form field suitable for bulk import of objects in CSV format.
for_filterset_form: Return a form field suitable for use in a FilterSet form.
"""
initial = self.default if set_initial else None
required = self.required if enforce_required else False
@@ -520,7 +528,7 @@ class CustomField(CloningMixin, ExportTemplatesMixin, OwnerMixin, ChangeLoggedMo
field_class = CSVMultipleChoiceField
field = field_class(choices=choices, required=required, initial=initial)
else:
if self.type == CustomFieldTypeChoices.TYPE_SELECT:
if self.type == CustomFieldTypeChoices.TYPE_SELECT and not for_filterset_form:
field_class = DynamicChoiceField
widget_class = APISelect
else:
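Illustration (not part of the change): a hedged usage sketch of the new for_filterset_form flag; the custom field name is hypothetical.
from extras.models import CustomField

cf = CustomField.objects.get(name='environment')  # a TYPE_SELECT custom field
field = cf.to_form_field(enforce_required=False, for_filterset_form=True)
# With for_filterset_form=True, a select-type field uses a static choice field
# rather than DynamicChoiceField/APISelect, per the branch above.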
@@ -871,6 +879,16 @@ class CustomFieldChoiceSet(CloningMixin, ExportTemplatesMixin, OwnerMixin, Chang
if not self.base_choices and not self.extra_choices:
raise ValidationError(_("Must define base or extra choices."))
# Check for duplicate values in extra_choices
choice_values = [c[0] for c in self.extra_choices] if self.extra_choices else []
if len(set(choice_values)) != len(choice_values):
# At least one duplicate value is present. Find the first one and raise an error.
_seen = []
for value in choice_values:
if value in _seen:
raise ValidationError(_("Duplicate value '{value}' found in extra choices.").format(value=value))
_seen.append(value)
# Check whether any choices have been removed. If so, check whether any of the removed
# choices are still set in custom field data for any object.
original_choices = set([

View File

@@ -1506,19 +1506,18 @@ class CustomFieldModelTest(TestCase):
def test_invalid_data(self):
"""
Setting custom field data for a non-applicable (or non-existent) CustomField should raise a ValidationError.
Any invalid or stale custom field data should be removed from the instance.
"""
site = Site(name='Test Site', slug='test-site')
# Set custom field data
site.custom_field_data['foo'] = 'abc'
site.custom_field_data['bar'] = 'def'
with self.assertRaises(ValidationError):
site.clean()
del site.custom_field_data['bar']
site.clean()
self.assertIn('foo', site.custom_field_data)
self.assertNotIn('bar', site.custom_field_data)
def test_missing_required_field(self):
"""
Check that a ValidationError is raised if any required custom fields are not present.

View File

@@ -5,6 +5,7 @@ from dcim.forms import SiteForm
from dcim.models import Site
from extras.choices import CustomFieldTypeChoices
from extras.forms import SavedFilterForm
from extras.forms.model_forms import CustomFieldChoiceSetForm
from extras.models import CustomField, CustomFieldChoiceSet
@@ -90,6 +91,31 @@ class CustomFieldModelFormTest(TestCase):
self.assertIsNone(instance.custom_field_data[field_type])
class CustomFieldChoiceSetFormTest(TestCase):
def test_escaped_colons_preserved_on_edit(self):
choice_set = CustomFieldChoiceSet.objects.create(
name='Test Choice Set',
extra_choices=[['foo:bar', 'label'], ['value', 'label:with:colons']]
)
form = CustomFieldChoiceSetForm(instance=choice_set)
initial_choices = form.initial['extra_choices']
# colons are re-escaped
self.assertEqual(initial_choices, 'foo\\:bar:label\nvalue:label\\:with\\:colons')
form = CustomFieldChoiceSetForm(
{'name': choice_set.name, 'extra_choices': initial_choices},
instance=choice_set
)
self.assertTrue(form.is_valid())
updated = form.save()
# cleaned extra_choices are correct; note they come back as a list of tuples
self.assertEqual(updated.extra_choices, [('foo:bar', 'label'), ('value', 'label:with:colons')])
class SavedFilterFormTest(TestCase):
def test_basic_submit(self):

View File

@@ -51,7 +51,14 @@ class ImageAttachmentsPanel(panels.ObjectsTablePanel):
]
def __init__(self, **kwargs):
super().__init__('extras.imageattachment', **kwargs)
super().__init__(
'extras.imageattachment',
filters={
'object_type_id': lambda ctx: ContentType.objects.get_for_model(ctx['object']).pk,
'object_id': lambda ctx: ctx['object'].pk,
},
**kwargs,
)
class TagsPanel(panels.ObjectPanel):

View File

@@ -59,24 +59,18 @@ class PrefixSerializer(PrimaryModelSerializer):
vlan = VLANSerializer(nested=True, required=False, allow_null=True)
status = ChoiceField(choices=PrefixStatusChoices, required=False)
role = RoleSerializer(nested=True, required=False, allow_null=True)
_children = serializers.IntegerField(read_only=True)
children = serializers.IntegerField(read_only=True)
_depth = serializers.IntegerField(read_only=True)
prefix = IPNetworkField()
class Meta:
model = Prefix
fields = [
'id', 'url', 'display_url', 'display', 'family', 'aggregate', 'parent', 'prefix', 'vrf', 'scope_type',
'scope_id', 'scope', 'tenant', 'vlan', 'status', 'role', 'is_pool', 'mark_utilized', 'description',
'owner', 'comments', 'tags', 'custom_fields', 'created', 'last_updated', '_children', '_depth',
'id', 'url', 'display_url', 'display', 'family', 'prefix', 'vrf', 'scope_type', 'scope_id', 'scope',
'tenant', 'vlan', 'status', 'role', 'is_pool', 'mark_utilized', 'description', 'owner', 'comments', 'tags',
'custom_fields', 'created', 'last_updated', 'children', '_depth',
]
brief_fields = ('id', 'url', 'display', 'family', 'aggregate', 'parent', 'prefix', 'description', '_depth')
def get_fields(self):
fields = super(PrefixSerializer, self).get_fields()
fields['parent'] = PrefixSerializer(nested=True, read_only=True)
return fields
brief_fields = ('id', 'url', 'display', 'family', 'prefix', 'description', '_depth')
class PrefixLengthSerializer(serializers.Serializer):
@@ -130,9 +124,7 @@ class AvailablePrefixSerializer(serializers.Serializer):
# IP ranges
#
class IPRangeSerializer(PrimaryModelSerializer):
prefix = PrefixSerializer(nested=True, required=False, allow_null=True)
family = ChoiceField(choices=IPAddressFamilyChoices, read_only=True)
start_address = IPAddressField()
end_address = IPAddressField()
@@ -144,11 +136,11 @@ class IPRangeSerializer(PrimaryModelSerializer):
class Meta:
model = IPRange
fields = [
'id', 'url', 'display_url', 'display', 'family', 'prefix', 'start_address', 'end_address', 'size', 'vrf',
'tenant', 'status', 'role', 'description', 'owner', 'comments', 'tags', 'custom_fields', 'created',
'last_updated', 'mark_populated', 'mark_utilized',
'id', 'url', 'display_url', 'display', 'family', 'start_address', 'end_address', 'size', 'vrf', 'tenant',
'status', 'role', 'description', 'owner', 'comments', 'tags', 'custom_fields', 'created', 'last_updated',
'mark_populated', 'mark_utilized',
]
brief_fields = ('id', 'url', 'display', 'family', 'prefix', 'start_address', 'end_address', 'description')
brief_fields = ('id', 'url', 'display', 'family', 'start_address', 'end_address', 'description')
#
@@ -156,7 +148,6 @@ class IPRangeSerializer(PrimaryModelSerializer):
#
class IPAddressSerializer(PrimaryModelSerializer):
prefix = PrefixSerializer(nested=True, required=False, allow_null=True)
family = ChoiceField(choices=IPAddressFamilyChoices, read_only=True)
address = IPAddressField()
vrf = VRFSerializer(nested=True, required=False, allow_null=True)
@@ -175,11 +166,11 @@ class IPAddressSerializer(PrimaryModelSerializer):
class Meta:
model = IPAddress
fields = [
'id', 'url', 'display_url', 'display', 'family', 'prefix', 'address', 'vrf', 'tenant', 'status', 'role',
'id', 'url', 'display_url', 'display', 'family', 'address', 'vrf', 'tenant', 'status', 'role',
'assigned_object_type', 'assigned_object_id', 'assigned_object', 'nat_inside', 'nat_outside',
'dns_name', 'description', 'owner', 'comments', 'tags', 'custom_fields', 'created', 'last_updated',
]
brief_fields = ('id', 'url', 'display', 'family', 'prefix', 'address', 'description')
brief_fields = ('id', 'url', 'display', 'family', 'address', 'description')
class AvailableIPSerializer(serializers.Serializer):

View File

@@ -16,6 +16,7 @@ __all__ = (
# BGP ASN bounds
BGP_ASN_MIN = 1
BGP_ASN_MAX = 2**32 - 1
BGP_ASN_ASDOT_BASE = 2**16
class BaseIPField(models.Field):
@@ -126,3 +127,16 @@ class ASNField(models.BigIntegerField):
}
defaults.update(**kwargs)
return super().formfield(**defaults)
@staticmethod
def to_asdot(value) -> str:
"""
Return ASDOT notation for AS numbers greater than 16 bits.
"""
if value is None:
return ''
if value >= BGP_ASN_ASDOT_BASE:
hi, lo = divmod(value, BGP_ASN_ASDOT_BASE)
return f'{hi}.{lo}'
return str(value)
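Illustration (not part of the change): a few worked values for the conversion above.
ASNField.to_asdot(65000)       # '65000'  (fits in 16 bits, returned as-is)
ASNField.to_asdot(65536)       # '1.0'    (divmod(65536, 65536) == (1, 0))
ASNField.to_asdot(4200000000)  # '64086.59904'
ASNField.to_asdot(None)        # ''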

View File

@@ -340,26 +340,6 @@ class PrefixFilterSet(PrimaryModelFilterSet, ScopedFilterSet, TenancyFilterSet,
field_name='prefix',
lookup_expr='net_mask_length__lte'
)
aggregate_id = django_filters.ModelMultipleChoiceFilter(
queryset=Aggregate.objects.all(),
label=_('Aggregate'),
)
aggregate = django_filters.ModelMultipleChoiceFilter(
field_name='aggregate__prefix',
queryset=Aggregate.objects.all(),
to_field_name='prefix',
label=_('Aggregate (Prefix)'),
)
parent_id = django_filters.ModelMultipleChoiceFilter(
queryset=Prefix.objects.all(),
label=_('Parent Prefix'),
)
parent = django_filters.ModelMultipleChoiceFilter(
field_name='parent__prefix',
queryset=Prefix.objects.all(),
to_field_name='prefix',
label=_('Parent Prefix (Prefix)'),
)
vrf_id = django_filters.ModelMultipleChoiceFilter(
queryset=VRF.objects.all(),
label=_('VRF'),
@@ -504,16 +484,6 @@ class IPRangeFilterSet(PrimaryModelFilterSet, TenancyFilterSet, ContactModelFilt
method='search_contains',
label=_('Ranges which contain this prefix or IP'),
)
prefix_id = django_filters.ModelMultipleChoiceFilter(
queryset=Prefix.objects.all(),
label=_('Prefix (ID)'),
)
prefix = django_filters.ModelMultipleChoiceFilter(
field_name='prefix__prefix',
queryset=Prefix.objects.all(),
to_field_name='prefix',
label=_('Prefix'),
)
vrf_id = django_filters.ModelMultipleChoiceFilter(
queryset=VRF.objects.all(),
label=_('VRF'),
@@ -599,16 +569,6 @@ class IPAddressFilterSet(PrimaryModelFilterSet, TenancyFilterSet, ContactModelFi
method='search_by_parent',
label=_('Parent prefix'),
)
prefix_id = django_filters.ModelMultipleChoiceFilter(
queryset=Prefix.objects.all(),
label=_('Prefix (ID)'),
)
prefix = django_filters.ModelMultipleChoiceFilter(
field_name='prefix__prefix',
queryset=Prefix.objects.all(),
to_field_name='prefix',
label=_('Prefix (prefix)'),
)
address = MultiValueCharFilter(
method='filter_address',
label=_('Address'),

View File

@@ -168,11 +168,6 @@ class RoleBulkEditForm(OrganizationalModelBulkEditForm):
class PrefixBulkEditForm(ScopedBulkEditForm, PrimaryModelBulkEditForm):
parent = DynamicModelChoiceField(
queryset=Prefix.objects.all(),
required=False,
label=_('Parent Prefix')
)
vlan_group = DynamicModelChoiceField(
queryset=VLANGroup.objects.all(),
required=False,
@@ -226,7 +221,7 @@ class PrefixBulkEditForm(ScopedBulkEditForm, PrimaryModelBulkEditForm):
model = Prefix
fieldsets = (
FieldSet('tenant', 'status', 'role', 'description'),
FieldSet('parent', 'vrf', 'prefix_length', 'is_pool', 'mark_utilized', name=_('Addressing')),
FieldSet('vrf', 'prefix_length', 'is_pool', 'mark_utilized', name=_('Addressing')),
FieldSet('scope_type', 'scope', name=_('Scope')),
FieldSet('vlan_group', 'vlan', name=_('VLAN Assignment')),
)
@@ -236,11 +231,6 @@ class PrefixBulkEditForm(ScopedBulkEditForm, PrimaryModelBulkEditForm):
class IPRangeBulkEditForm(PrimaryModelBulkEditForm):
prefix = DynamicModelChoiceField(
queryset=Prefix.objects.all(),
required=False,
label=_('Prefix')
)
vrf = DynamicModelChoiceField(
queryset=VRF.objects.all(),
required=False,
@@ -282,16 +272,6 @@ class IPRangeBulkEditForm(PrimaryModelBulkEditForm):
class IPAddressBulkEditForm(PrimaryModelBulkEditForm):
prefix = DynamicModelChoiceField(
queryset=Prefix.objects.all(),
required=False,
label=_('Prefix')
)
prefix = DynamicModelChoiceField(
queryset=Prefix.objects.all(),
required=False,
label=_('Prefix')
)
vrf = DynamicModelChoiceField(
queryset=VRF.objects.all(),
required=False,
@@ -327,10 +307,10 @@ class IPAddressBulkEditForm(PrimaryModelBulkEditForm):
model = IPAddress
fieldsets = (
FieldSet('status', 'role', 'tenant', 'description'),
FieldSet('prefix', 'vrf', 'mask_length', 'dns_name', name=_('Addressing')),
FieldSet('vrf', 'mask_length', 'dns_name', name=_('Addressing')),
)
nullable_fields = (
'prefix', 'vrf', 'role', 'tenant', 'dns_name', 'description', 'comments',
'vrf', 'role', 'tenant', 'dns_name', 'description', 'comments',
)

View File

@@ -155,18 +155,6 @@ class RoleImportForm(OrganizationalModelImportForm):
class PrefixImportForm(ScopedImportForm, PrimaryModelImportForm):
aggregate = CSVModelChoiceField(
label=_('Aggregate'),
queryset=Aggregate.objects.all(),
to_field_name='prefix',
required=False
)
parent = CSVModelChoiceField(
label=_('Prefix'),
queryset=Prefix.objects.all(),
to_field_name='prefix',
required=False
)
vrf = CSVModelChoiceField(
label=_('VRF'),
queryset=VRF.objects.all(),
@@ -241,10 +229,6 @@ class PrefixImportForm(ScopedImportForm, PrimaryModelImportForm):
query |= Q(**{
f"site__{self.fields['vlan_site'].to_field_name}": vlan_site
})
# Don't Forget to include VLANs without a site in the filter
query |= Q(**{
f"site__{self.fields['vlan_site'].to_field_name}__isnull": True
})
if vlan_group:
query &= Q(**{
@@ -254,26 +238,8 @@ class PrefixImportForm(ScopedImportForm, PrimaryModelImportForm):
queryset = self.fields['vlan'].queryset.filter(query)
self.fields['vlan'].queryset = queryset
# Limit Prefix queryset by assigned vrf
vrf = data.get('vrf')
query = Q()
if vrf:
query &= Q(**{
f"vrf__{self.fields['vrf'].to_field_name}": vrf
})
queryset = self.fields['parent'].queryset.filter(query)
self.fields['parent'].queryset = queryset
class IPRangeImportForm(PrimaryModelImportForm):
prefix = CSVModelChoiceField(
label=_('Prefix'),
queryset=Prefix.objects.all(),
to_field_name='prefix',
required=True,
help_text=_('Assigned prefix')
)
vrf = CSVModelChoiceField(
label=_('VRF'),
queryset=VRF.objects.all(),
@@ -308,29 +274,8 @@ class IPRangeImportForm(PrimaryModelImportForm):
'description', 'owner', 'comments', 'tags',
)
def __init__(self, data=None, *args, **kwargs):
super().__init__(data, *args, **kwargs)
# Limit Prefix queryset by assigned vrf
vrf = data.get('vrf')
query = Q()
if vrf:
query &= Q(**{
f"vrf__{self.fields['vrf'].to_field_name}": vrf
})
queryset = self.fields['prefix'].queryset.filter(query)
self.fields['prefix'].queryset = queryset
class IPAddressImportForm(PrimaryModelImportForm):
prefix = CSVModelChoiceField(
label=_('Prefix'),
queryset=Prefix.objects.all(),
required=False,
to_field_name='prefix',
help_text=_('Assigned prefix')
)
vrf = CSVModelChoiceField(
label=_('VRF'),
queryset=VRF.objects.all(),
@@ -398,8 +343,8 @@ class IPAddressImportForm(PrimaryModelImportForm):
class Meta:
model = IPAddress
fields = [
'prefix', 'address', 'vrf', 'tenant', 'status', 'role', 'device', 'virtual_machine', 'interface',
'fhrp_group', 'is_primary', 'is_oob', 'dns_name', 'owner', 'description', 'comments', 'tags',
'address', 'vrf', 'tenant', 'status', 'role', 'device', 'virtual_machine', 'interface', 'fhrp_group',
'is_primary', 'is_oob', 'dns_name', 'description', 'owner', 'comments', 'tags',
]
def __init__(self, data=None, *args, **kwargs):
@@ -407,15 +352,6 @@ class IPAddressImportForm(PrimaryModelImportForm):
if data:
# Limit Prefix queryset by assigned vrf
vrf = data.get('vrf')
query = Q()
if vrf:
query &= Q(**{f"vrf__{self.fields['vrf'].to_field_name}": vrf})
queryset = self.fields['prefix'].queryset.filter(query)
self.fields['prefix'].queryset = queryset
# Limit interface queryset by assigned device
if data.get('device'):
self.fields['interface'].queryset = Interface.objects.filter(

View File

@@ -211,12 +211,6 @@ class PrefixFilterForm(ContactModelFilterForm, TenancyFilterForm, PrimaryModelFi
choices=PREFIX_MASK_LENGTH_CHOICES,
label=_('Mask length')
)
aggregate_id = DynamicModelMultipleChoiceField(
queryset=Aggregate.objects.all(),
required=False,
label=_('Aggregate'),
null_option='Global'
)
vrf_id = DynamicModelMultipleChoiceField(
queryset=VRF.objects.all(),
required=False,
@@ -291,18 +285,10 @@ class IPRangeFilterForm(ContactModelFilterForm, TenancyFilterForm, PrimaryModelF
model = IPRange
fieldsets = (
FieldSet('q', 'filter_id', 'tag', 'owner_id'),
FieldSet(
'prefix', 'family', 'vrf_id', 'status', 'role_id', 'mark_populated', 'mark_utilized', name=_('Attributes')
),
FieldSet('family', 'vrf_id', 'status', 'role_id', 'mark_populated', 'mark_utilized', name=_('Attributes')),
FieldSet('tenant_group_id', 'tenant_id', name=_('Tenant')),
FieldSet('contact', 'contact_role', 'contact_group', name=_('Contacts')),
)
prefix = DynamicModelMultipleChoiceField(
queryset=Prefix.objects.all(),
required=False,
label=_('Prefix'),
null_option='None'
)
family = forms.ChoiceField(
required=False,
choices=add_blank_choice(IPAddressFamilyChoices),
@@ -347,7 +333,7 @@ class IPAddressFilterForm(ContactModelFilterForm, TenancyFilterForm, PrimaryMode
fieldsets = (
FieldSet('q', 'filter_id', 'tag', 'owner_id'),
FieldSet(
'prefix', 'parent', 'family', 'status', 'role', 'mask_length', 'assigned_to_interface', 'dns_name',
'parent', 'family', 'status', 'role', 'mask_length', 'assigned_to_interface', 'dns_name',
name=_('Attributes')
),
FieldSet('vrf_id', 'present_in_vrf_id', name=_('VRF')),
@@ -355,7 +341,7 @@ class IPAddressFilterForm(ContactModelFilterForm, TenancyFilterForm, PrimaryMode
FieldSet('device_id', 'virtual_machine_id', name=_('Device/VM')),
FieldSet('contact', 'contact_role', 'contact_group', name=_('Contacts')),
)
selector_fields = ('filter_id', 'q', 'region_id', 'group_id', 'prefix_id', 'parent', 'status', 'role')
selector_fields = ('filter_id', 'q', 'region_id', 'group_id', 'parent', 'status', 'role')
parent = forms.CharField(
required=False,
widget=forms.TextInput(
@@ -375,11 +361,6 @@ class IPAddressFilterForm(ContactModelFilterForm, TenancyFilterForm, PrimaryMode
choices=IPADDRESS_MASK_LENGTH_CHOICES,
label=_('Mask length')
)
prefix_id = DynamicModelMultipleChoiceField(
queryset=Prefix.objects.all(),
required=False,
label=_('Prefix'),
)
vrf_id = DynamicModelMultipleChoiceField(
queryset=VRF.objects.all(),
required=False,

View File

@@ -241,11 +241,6 @@ class PrefixForm(TenancyForm, ScopedForm, PrimaryModelForm):
class IPRangeForm(TenancyForm, PrimaryModelForm):
prefix = DynamicModelChoiceField(
queryset=Prefix.objects.all(),
required=False,
label=_('Prefix')
)
vrf = DynamicModelChoiceField(
queryset=VRF.objects.all(),
required=False,
@@ -260,8 +255,8 @@ class IPRangeForm(TenancyForm, PrimaryModelForm):
fieldsets = (
FieldSet(
'prefix', 'vrf', 'start_address', 'end_address', 'role', 'status', 'mark_populated', 'mark_utilized',
'description', 'tags', name=_('IP Range')
'vrf', 'start_address', 'end_address', 'role', 'status', 'mark_populated', 'mark_utilized', 'description',
'tags', name=_('IP Range')
),
FieldSet('tenant_group', 'tenant', name=_('Tenancy')),
)
@@ -269,21 +264,12 @@ class IPRangeForm(TenancyForm, PrimaryModelForm):
class Meta:
model = IPRange
fields = [
'prefix', 'vrf', 'start_address', 'end_address', 'status', 'role', 'tenant_group', 'tenant',
'mark_populated', 'mark_utilized', 'description', 'owner', 'comments', 'tags',
'vrf', 'start_address', 'end_address', 'status', 'role', 'tenant_group', 'tenant', 'mark_populated',
'mark_utilized', 'description', 'owner', 'comments', 'tags',
]
class IPAddressForm(TenancyForm, PrimaryModelForm):
prefix = DynamicModelChoiceField(
queryset=Prefix.objects.all(),
required=False,
context={
'vrf': 'vrf',
},
selector=True,
label=_('Prefix'),
)
interface = DynamicModelChoiceField(
queryset=Interface.objects.all(),
required=False,
@@ -329,7 +315,7 @@ class IPAddressForm(TenancyForm, PrimaryModelForm):
)
fieldsets = (
FieldSet('prefix', 'address', 'status', 'role', 'vrf', 'dns_name', 'description', 'tags', name=_('IP Address')),
FieldSet('address', 'status', 'role', 'vrf', 'dns_name', 'description', 'tags', name=_('IP Address')),
FieldSet('tenant_group', 'tenant', name=_('Tenancy')),
FieldSet(
TabbedGroups(
@@ -345,8 +331,8 @@ class IPAddressForm(TenancyForm, PrimaryModelForm):
class Meta:
model = IPAddress
fields = [
'prefix', 'address', 'vrf', 'status', 'role', 'dns_name', 'primary_for_parent', 'oob_for_parent',
'nat_inside', 'tenant_group', 'tenant', 'description', 'owner', 'comments', 'tags',
'address', 'vrf', 'status', 'role', 'dns_name', 'primary_for_parent', 'oob_for_parent', 'nat_inside',
'tenant_group', 'tenant', 'description', 'owner', 'comments', 'tags',
]
def __init__(self, *args, **kwargs):
@@ -471,15 +457,6 @@ class IPAddressForm(TenancyForm, PrimaryModelForm):
class IPAddressBulkAddForm(TenancyForm, NetBoxModelForm):
prefix = DynamicModelChoiceField(
queryset=Prefix.objects.all(),
required=False,
context={
'vrf': 'vrf',
},
selector=True,
label=_('Prefix'),
)
vrf = DynamicModelChoiceField(
queryset=VRF.objects.all(),
required=False,
@@ -489,7 +466,7 @@ class IPAddressBulkAddForm(TenancyForm, NetBoxModelForm):
class Meta:
model = IPAddress
fields = [
'address', 'prefix', 'vrf', 'status', 'role', 'dns_name', 'description', 'tenant_group', 'tenant', 'tags',
'address', 'vrf', 'status', 'role', 'dns_name', 'description', 'tenant_group', 'tenant', 'tags',
]

View File

@@ -170,7 +170,6 @@ class FHRPGroupAssignmentFilter(ChangeLoggedModelFilter):
@strawberry_django.filter_type(models.IPAddress, lookups=True)
class IPAddressFilter(ContactFilterMixin, TenancyFilterMixin, PrimaryModelFilter):
prefix: Annotated['PrefixFilter', strawberry.lazy('ipam.graphql.filters')] | None = strawberry_django.filter_field()
address: FilterLookup[str] | None = strawberry_django.filter_field()
vrf: Annotated['VRFFilter', strawberry.lazy('ipam.graphql.filters')] | None = strawberry_django.filter_field()
vrf_id: ID | None = strawberry_django.filter_field()
@@ -222,7 +221,6 @@ class IPAddressFilter(ContactFilterMixin, TenancyFilterMixin, PrimaryModelFilter
@strawberry_django.filter_type(models.IPRange, lookups=True)
class IPRangeFilter(ContactFilterMixin, TenancyFilterMixin, PrimaryModelFilter):
prefix: Annotated['PrefixFilter', strawberry.lazy('ipam.graphql.filters')] | None = strawberry_django.filter_field()
start_address: FilterLookup[str] | None = strawberry_django.filter_field()
end_address: FilterLookup[str] | None = strawberry_django.filter_field()
size: Annotated['IntegerLookup', strawberry.lazy('netbox.graphql.filter_lookups')] | None = (
@@ -277,10 +275,6 @@ class IPRangeFilter(ContactFilterMixin, TenancyFilterMixin, PrimaryModelFilter):
@strawberry_django.filter_type(models.Prefix, lookups=True)
class PrefixFilter(ContactFilterMixin, ScopedFilterMixin, TenancyFilterMixin, PrimaryModelFilter):
aggregate: Annotated['AggregateFilter', strawberry.lazy('ipam.graphql.filters')] | None = (
strawberry_django.filter_field()
)
parent: Annotated['PrefixFilter', strawberry.lazy('ipam.graphql.filters')] | None = strawberry_django.filter_field()
prefix: FilterLookup[str] | None = strawberry_django.filter_field()
vrf: Annotated['VRFFilter', strawberry.lazy('ipam.graphql.filters')] | None = strawberry_django.filter_field()
vrf_id: ID | None = strawberry_django.filter_field()

View File

@@ -143,7 +143,6 @@ class FHRPGroupAssignmentType(BaseObjectType):
)
class IPAddressType(ContactsMixin, BaseIPAddressFamilyType, PrimaryObjectType):
address: str
prefix: Annotated["PrefixType", strawberry.lazy('ipam.graphql.types')] | None
vrf: Annotated["VRFType", strawberry.lazy('ipam.graphql.types')] | None
tenant: Annotated["TenantType", strawberry.lazy('tenancy.graphql.types')] | None
nat_inside: Annotated["IPAddressType", strawberry.lazy('ipam.graphql.types')] | None
@@ -168,7 +167,6 @@ class IPAddressType(ContactsMixin, BaseIPAddressFamilyType, PrimaryObjectType):
pagination=True
)
class IPRangeType(ContactsMixin, PrimaryObjectType):
prefix: Annotated["PrefixType", strawberry.lazy('ipam.graphql.types')] | None
start_address: str
end_address: str
vrf: Annotated["VRFType", strawberry.lazy('ipam.graphql.types')] | None
@@ -183,8 +181,6 @@ class IPRangeType(ContactsMixin, PrimaryObjectType):
pagination=True
)
class PrefixType(ContactsMixin, BaseIPAddressFamilyType, PrimaryObjectType):
aggregate: Annotated["AggregateType", strawberry.lazy('ipam.graphql.types')] | None
parent: Annotated["PrefixType", strawberry.lazy('ipam.graphql.types')] | None
prefix: str
vrf: Annotated["VRFType", strawberry.lazy('ipam.graphql.types')] | None
tenant: Annotated["TenantType", strawberry.lazy('tenancy.graphql.types')] | None

View File

@@ -1,58 +0,0 @@
# Generated by Django 5.0.9 on 2025-02-20 16:49
import django.db.models.deletion
from django.db import migrations, models
class Migration(migrations.Migration):
dependencies = [
('ipam', '0086_gfk_indexes'),
]
operations = [
migrations.AddField(
model_name='prefix',
name='parent',
field=models.ForeignKey(
blank=True,
null=True,
on_delete=django.db.models.deletion.SET_NULL,
related_name='children',
to='ipam.prefix',
),
),
migrations.AddField(
model_name='ipaddress',
name='prefix',
field=models.ForeignKey(
blank=True,
null=True,
on_delete=django.db.models.deletion.SET_NULL,
related_name='ip_addresses',
to='ipam.prefix',
),
),
migrations.AddField(
model_name='iprange',
name='prefix',
field=models.ForeignKey(
blank=True,
null=True,
on_delete=django.db.models.deletion.SET_NULL,
related_name='ip_ranges',
to='ipam.prefix',
),
),
migrations.AddField(
model_name='prefix',
name='aggregate',
field=models.ForeignKey(
blank=True,
null=True,
on_delete=django.db.models.deletion.SET_NULL,
related_name='prefixes',
to='ipam.aggregate',
),
),
]

View File

@@ -1,132 +0,0 @@
# Generated by Django 5.0.9 on 2025-02-20 16:49
import sys
import time
from django.db import migrations, models
from ipam.choices import PrefixStatusChoices
def draw_progress(count, total, length=20):
if total == 0:
return
progress = count / total
percent = int(progress * 100)
bar = int(progress * length)
sys.stdout.write('\r')
sys.stdout.write(f"[{'=' * bar:{length}s}] {percent}%")
sys.stdout.flush()
def set_prefix(apps, schema_editor, model, attr='address', parent_attr='prefix', parent_model='Prefix'):
start = time.time()
ChildModel = apps.get_model('ipam', model)
ParentModel = apps.get_model('ipam', parent_model)
addresses = ChildModel.objects.all()
total = addresses.count()
if total == 0:
return
print('\r\n')
print(f'Migrating {parent_model}')
print('\r\n')
i = 0
draw_progress(i, total, 50)
for address in addresses:
i += 1
address_attr = getattr(address, attr)
prefixes = ParentModel.objects.filter(
prefix__net_contains_or_equals=str(address_attr.ip),
prefix__net_mask_length__lte=address_attr.prefixlen,
)
setattr(address, parent_attr, prefixes.last())
try:
address.save()
except Exception as e:
print(f'Error at {address}')
raise e
draw_progress(i, total, 50)
end = time.time()
print(f"\r\nElapsed Time: {end - start:.2f}s")
def set_ipaddress_prefix(apps, schema_editor):
set_prefix(apps, schema_editor, 'IPAddress')
def unset_ipaddress_prefix(apps, schema_editor):
IPAddress = apps.get_model('ipam', 'IPAddress')
IPAddress.objects.update(prefix=None)
def set_iprange_prefix(apps, schema_editor):
set_prefix(apps, schema_editor, 'IPRange', 'start_address')
def unset_iprange_prefix(apps, schema_editor):
IPRange = apps.get_model('ipam', 'IPRange')
IPRange.objects.update(prefix=None)
def set_prefix_aggregate(apps, schema_editor):
set_prefix(apps, schema_editor, 'Prefix', 'prefix', 'aggregate', 'Aggregate')
def unset_prefix_aggregate(apps, schema_editor):
Prefix = apps.get_model('ipam', 'Prefix')
Prefix.objects.update(aggregate=None)
def set_prefix_parent(apps, schema_editor):
Prefix = apps.get_model('ipam', 'Prefix')
start = time.time()
addresses = Prefix.objects.all()
i = 0
total = addresses.count()
if total == 0:
return
print('\r\n')
draw_progress(i, total, 50)
for address in addresses:
i += 1
prefixes = Prefix.objects.exclude(pk=address.pk).filter(
models.Q(vrf=address.vrf, prefix__net_contains=str(address.prefix.ip))
| models.Q(
vrf=None,
status=PrefixStatusChoices.STATUS_CONTAINER,
prefix__net_contains=str(address.prefix.ip),
)
)
if not prefixes.exists():
draw_progress(i, total, 50)
continue
address.parent = prefixes.last()
address.save()
draw_progress(i, total, 50)
end = time.time()
print(f"\r\nElapsed Time: {end - start:.2f}s")
def unset_prefix_parent(apps, schema_editor):
Prefix = apps.get_model('ipam', 'Prefix')
Prefix.objects.update(parent=None)
class Migration(migrations.Migration):
dependencies = [
('ipam', '0087_ipaddress_iprange_prefix_parent'),
]
operations = [
migrations.RunPython(set_ipaddress_prefix, unset_ipaddress_prefix),
migrations.RunPython(set_iprange_prefix, unset_iprange_prefix),
migrations.RunPython(set_prefix_aggregate, unset_prefix_aggregate),
migrations.RunPython(set_prefix_parent, unset_prefix_parent),
]

View File

@@ -1,25 +0,0 @@
# Generated by Django 5.2.5 on 2025-11-25 03:53
import django.db.models.deletion
from django.db import migrations, models
class Migration(migrations.Migration):
dependencies = [
('ipam', '0089_prefix_ipam_prefix_delete_prefix_ipam_prefix_insert'),
]
operations = [
migrations.AlterField(
model_name='prefix',
name='parent',
field=models.ForeignKey(
blank=True,
null=True,
on_delete=django.db.models.deletion.DO_NOTHING,
related_name='children',
to='ipam.prefix',
),
),
]

View File

@@ -1,43 +0,0 @@
# Generated by Django 5.2.5 on 2025-11-06 03:24
import pgtrigger.compiler
import pgtrigger.migrations
from django.db import migrations
class Migration(migrations.Migration):
dependencies = [
('ipam', '0088_ipaddress_iprange_prefix_parent_data'),
]
operations = [
pgtrigger.migrations.AddTrigger(
model_name='prefix',
trigger=pgtrigger.compiler.Trigger(
name='ipam_prefix_delete',
sql=pgtrigger.compiler.UpsertTriggerSql(
func="\n-- Update Child Prefix's with Prefix's PARENT\nUPDATE ipam_prefix SET parent_id=OLD.parent_id WHERE parent_id=OLD.id;\nRETURN OLD;\n", # noqa: E501
hash='899e1943cb201118be7ef02f36f49747224774f2',
operation='DELETE',
pgid='pgtrigger_ipam_prefix_delete_e7810',
table='ipam_prefix',
when='BEFORE',
),
),
),
pgtrigger.migrations.AddTrigger(
model_name='prefix',
trigger=pgtrigger.compiler.Trigger(
name='ipam_prefix_insert',
sql=pgtrigger.compiler.UpsertTriggerSql(
func="\nUPDATE ipam_prefix\nSET parent_id=NEW.id \nWHERE \n prefix << NEW.prefix\n AND\n (\n (vrf_id = NEW.vrf_id OR (vrf_id IS NULL AND NEW.vrf_id IS NULL))\n OR\n (\n NEW.vrf_id IS NULL\n AND\n NEW.status = 'container'\n AND\n NOT EXISTS(\n SELECT 1 FROM ipam_prefix p WHERE p.prefix >> ipam_prefix.prefix AND p.vrf_id = ipam_prefix.vrf_id\n )\n )\n )\n AND id != NEW.id\n AND NOT EXISTS (\n SELECT 1 FROM ipam_prefix p\n WHERE\n p.prefix >> ipam_prefix.prefix\n AND p.prefix << NEW.prefix\n AND (\n (p.vrf_id = ipam_prefix.vrf_id OR (p.vrf_id IS NULL AND ipam_prefix.vrf_id IS NULL))\n OR\n (p.vrf_id IS NULL AND p.status = 'container')\n )\n AND p.id != NEW.id\n )\n;\nRETURN NEW;\n", # noqa: E501
hash='0e05bbe61861227a9eb710b6c94bae9e0cc7119e',
operation='INSERT',
pgid='pgtrigger_ipam_prefix_insert_46c72',
table='ipam_prefix',
when='AFTER',
),
),
),
]

View File

@@ -1,65 +0,0 @@
# Generated by Django 5.2.5 on 2025-11-25 06:00
import pgtrigger.compiler
import pgtrigger.migrations
from django.db import migrations
class Migration(migrations.Migration):
dependencies = [
('ipam', '0089_alter_prefix_parent'),
]
operations = [
pgtrigger.migrations.RemoveTrigger(
model_name='prefix',
name='ipam_prefix_delete',
),
pgtrigger.migrations.RemoveTrigger(
model_name='prefix',
name='ipam_prefix_insert',
),
pgtrigger.migrations.AddTrigger(
model_name='prefix',
trigger=pgtrigger.compiler.Trigger(
name='ipam_prefix_delete',
sql=pgtrigger.compiler.UpsertTriggerSql(
func="\n-- Update Child Prefix's with Prefix's PARENT This is a safe assumption based on the fact that the parent would be the\n-- next direct parent for anything else that could contain this prefix\nUPDATE ipam_prefix SET parent_id=OLD.parent_id WHERE parent_id=OLD.id;\nRETURN OLD;\n", # noqa: E501
hash='ee3f890009c05a3617428158e7b6f3d77317885d',
operation='DELETE',
pgid='pgtrigger_ipam_prefix_delete_e7810',
table='ipam_prefix',
when='BEFORE',
),
),
),
pgtrigger.migrations.AddTrigger(
model_name='prefix',
trigger=pgtrigger.compiler.Trigger(
name='ipam_prefix_insert',
sql=pgtrigger.compiler.UpsertTriggerSql(
func="\n-- Update the prefix with the new parent if the parent is the most appropriate prefix\nUPDATE ipam_prefix\nSET parent_id=NEW.id\nWHERE\n prefix << NEW.prefix\n AND\n (\n (vrf_id = NEW.vrf_id OR (vrf_id IS NULL AND NEW.vrf_id IS NULL))\n OR\n (\n NEW.vrf_id IS NULL\n AND\n NEW.status = 'container'\n AND\n NOT EXISTS(\n SELECT 1 FROM ipam_prefix p WHERE p.prefix >> ipam_prefix.prefix AND p.vrf_id = ipam_prefix.vrf_id\n )\n )\n )\n AND id != NEW.id\n AND NOT EXISTS (\n SELECT 1 FROM ipam_prefix p\n WHERE\n p.prefix >> ipam_prefix.prefix\n AND p.prefix << NEW.prefix\n AND (\n (p.vrf_id = ipam_prefix.vrf_id OR (p.vrf_id IS NULL AND ipam_prefix.vrf_id IS NULL))\n OR\n (p.vrf_id IS NULL AND p.status = 'container')\n )\n AND p.id != NEW.id\n )\n;\nRETURN NEW;\n", # noqa: E501
hash='1d71498f09e767183d3b0d29c06c9ac9e2cc000a',
operation='INSERT',
pgid='pgtrigger_ipam_prefix_insert_46c72',
table='ipam_prefix',
when='AFTER',
),
),
),
pgtrigger.migrations.AddTrigger(
model_name='prefix',
trigger=pgtrigger.compiler.Trigger(
name='ipam_prefix_update',
sql=pgtrigger.compiler.UpsertTriggerSql(
func="\n-- When a prefix changes, reassign any IPAddresses that no longer\n-- fall within the new prefix range to the parent prefix (or set null if no parent exists)\nUPDATE ipam_prefix\nSET parent_id = OLD.parent_id\nWHERE\n parent_id = NEW.id\n -- IP address no longer contained within the updated prefix\n AND NOT (prefix << NEW.prefix);\n\n-- Update the prefix with the new parent if the parent is the most appropriate prefix\nUPDATE ipam_prefix\nSET parent_id=NEW.id\nWHERE\n prefix << NEW.prefix\n AND\n (\n (vrf_id = NEW.vrf_id OR (vrf_id IS NULL AND NEW.vrf_id IS NULL))\n OR\n (\n NEW.vrf_id IS NULL\n AND\n NEW.status = 'container'\n AND\n NOT EXISTS(\n SELECT 1 FROM ipam_prefix p WHERE p.prefix >> ipam_prefix.prefix AND p.vrf_id = ipam_prefix.vrf_id\n )\n )\n )\n AND id != NEW.id\n AND NOT EXISTS (\n SELECT 1 FROM ipam_prefix p\n WHERE\n p.prefix >> ipam_prefix.prefix\n AND p.prefix << NEW.prefix\n AND (\n (p.vrf_id = ipam_prefix.vrf_id OR (p.vrf_id IS NULL AND ipam_prefix.vrf_id IS NULL))\n OR\n (p.vrf_id IS NULL AND p.status = 'container')\n )\n AND p.id != NEW.id\n )\n;\nRETURN NEW;\n", # noqa: E501
hash='747230a84703df5a4aa3d32e7f45b5a32525b799',
operation='UPDATE',
pgid='pgtrigger_ipam_prefix_update_e5fca',
table='ipam_prefix',
when='AFTER',
),
),
),
]

View File

@@ -55,13 +55,6 @@ class ASNRange(OrganizationalModel):
def __str__(self):
return f'{self.name} ({self.range_as_string()})'
@property
def range(self):
return range(self.start, self.end + 1)
def range_as_string(self):
return f'{self.start}-{self.end}'
def clean(self):
super().clean()
@@ -72,7 +65,45 @@ class ASNRange(OrganizationalModel):
)
)
@property
def range(self):
"""
Return a range of integers representing the ASN range.
"""
return range(self.start, self.end + 1)
@property
def start_asdot(self):
"""
Return ASDOT notation for AS numbers greater than 16 bits.
"""
return ASNField.to_asdot(self.start)
@property
def end_asdot(self):
"""
Return ASDOT notation for AS numbers greater than 16 bits.
"""
return ASNField.to_asdot(self.end)
def range_as_string(self):
"""
Return a string representation of the ASN range.
"""
return f'{self.start}-{self.end}'
def range_as_string_with_asdot(self):
"""
Return a string representation of the ASN range, including ASDOT notation.
"""
if self.end >= 65536:
return f'{self.range_as_string()} ({self.start_asdot}-{self.end_asdot})'
return self.range_as_string()
def get_child_asns(self):
"""
Return all child ASNs (ASNs within the range).
"""
return ASN.objects.filter(
asn__gte=self.start,
asn__lte=self.end
@@ -131,20 +162,20 @@ class ASN(ContactsMixin, PrimaryModel):
"""
Return ASDOT notation for AS numbers greater than 16 bits.
"""
if self.asn > 65535:
return f'{self.asn // 65536}.{self.asn % 65536}'
return self.asn
return ASNField.to_asdot(self.asn)
@property
def asn_with_asdot(self):
"""
Return both plain and ASDOT notation, where applicable.
"""
if self.asn > 65535:
return f'{self.asn} ({self.asn // 65536}.{self.asn % 65536})'
else:
return self.asn
if self.asn >= 65536:
return f'{self.asn} ({self.asn_asdot})'
return str(self.asn)
@property
def prefixed_name(self):
"""
Return the ASN with ASDOT notation prefixed with "AS".
"""
return f'AS{self.asn_with_asdot}'
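Illustration (not part of the change): a sketch of how the display properties above resolve, assuming an ASN instance with asn=65536.
asn.asn_asdot       # '1.0'
asn.asn_with_asdot  # '65536 (1.0)'
asn.prefixed_name   # 'AS65536 (1.0)'
# A 16-bit ASN such as 64512 yields plain '64512' / 'AS64512'.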

View File

@@ -1,5 +1,4 @@
import netaddr
import pgtrigger
from django.contrib.contenttypes.fields import GenericForeignKey
from django.contrib.contenttypes.models import ContentType
from django.contrib.postgres.indexes import GistIndex
@@ -9,7 +8,6 @@ from django.db.models import F
from django.db.models.functions import Cast
from django.utils.functional import cached_property
from django.utils.translation import gettext_lazy as _
from netaddr.ip import IPNetwork
from dcim.models.mixins import CachedScopeMixin
from ipam.choices import *
@@ -18,8 +16,6 @@ from ipam.fields import IPNetworkField, IPAddressField
from ipam.lookups import Host
from ipam.managers import IPAddressManager
from ipam.querysets import PrefixQuerySet
from ipam.triggers import ipam_prefix_delete_adjust_prefix_parent, ipam_prefix_insert_adjust_prefix_parent, \
ipam_prefix_update_adjust_prefix_parent
from ipam.validators import DNSValidator
from netbox.config import get_config
from netbox.models import OrganizationalModel, PrimaryModel
@@ -189,28 +185,31 @@ class Aggregate(ContactsMixin, GetAvailablePrefixesMixin, PrimaryModel):
return min(utilization, 100)
class Role(OrganizationalModel):
"""
A Role represents the functional role of a Prefix or VLAN; for example, "Customer," "Infrastructure," or
"Management."
"""
weight = models.PositiveSmallIntegerField(
verbose_name=_('weight'),
default=1000
)
class Meta:
ordering = ('weight', 'name')
verbose_name = _('role')
verbose_name_plural = _('roles')
def __str__(self):
return self.name
class Prefix(ContactsMixin, GetAvailablePrefixesMixin, CachedScopeMixin, PrimaryModel):
"""
A Prefix represents an IPv4 or IPv6 network, including mask length. Prefixes can optionally be scoped to certain
areas and/or assigned to VRFs. A Prefix must be assigned a status and may optionally be assigned a user-defined Role.
A Prefix can also be assigned to a VLAN where appropriate.
"""
aggregate = models.ForeignKey(
to='ipam.Aggregate',
on_delete=models.SET_NULL, # This is handled by triggers
related_name='prefixes',
blank=True,
null=True,
verbose_name=_('aggregate')
)
parent = models.ForeignKey(
to='ipam.Prefix',
on_delete=models.DO_NOTHING,
related_name='children',
blank=True,
null=True,
verbose_name=_('Prefix')
)
prefix = IPNetworkField(
verbose_name=_('prefix'),
help_text=_('IPv4 or IPv6 network with mask')
@@ -285,32 +284,8 @@ class Prefix(ContactsMixin, GetAvailablePrefixesMixin, CachedScopeMixin, Primary
verbose_name_plural = _('prefixes')
indexes = (
models.Index(fields=('scope_type', 'scope_id')),
GistIndex(
fields=['prefix'],
name='ipam_prefix_gist_idx',
opclasses=['inet_ops'],
),
)
triggers = (
pgtrigger.Trigger(
name='ipam_prefix_delete',
operation=pgtrigger.Delete,
when=pgtrigger.Before,
func=ipam_prefix_delete_adjust_prefix_parent,
),
pgtrigger.Trigger(
name='ipam_prefix_insert',
operation=pgtrigger.Insert,
when=pgtrigger.After,
func=ipam_prefix_insert_adjust_prefix_parent,
),
pgtrigger.Trigger(
name='ipam_prefix_update',
operation=pgtrigger.Update,
when=pgtrigger.After,
func=ipam_prefix_update_adjust_prefix_parent,
),
)
GistIndex(fields=['prefix'], name='ipam_prefix_gist_idx', opclasses=['inet_ops']),
)
def __init__(self, *args, **kwargs):
super().__init__(*args, **kwargs)
@@ -326,8 +301,6 @@ class Prefix(ContactsMixin, GetAvailablePrefixesMixin, CachedScopeMixin, Primary
super().clean()
if self.prefix:
if not isinstance(self.prefix, IPNetwork):
self.prefix = IPNetwork(self.prefix)
# /0 masks are not acceptable
if self.prefix.prefixlen == 0:
@@ -335,17 +308,6 @@ class Prefix(ContactsMixin, GetAvailablePrefixesMixin, CachedScopeMixin, Primary
'prefix': _("Cannot create prefix with /0 mask.")
})
if self.parent:
if self.prefix not in self.parent.prefix:
raise ValidationError({
'parent': _("Prefix must be part of parent prefix.")
})
if self.parent.status != PrefixStatusChoices.STATUS_CONTAINER and self.vrf != self.parent.vrf:
raise ValidationError({
'vrf': _("VRF must match the parent VRF.")
})
# Enforce unique IP space (if applicable)
if (self.vrf is None and get_config().ENFORCE_GLOBAL_UNIQUE) or (self.vrf and self.vrf.enforce_unique):
duplicate_prefixes = self.get_duplicates()
@@ -359,14 +321,6 @@ class Prefix(ContactsMixin, GetAvailablePrefixesMixin, CachedScopeMixin, Primary
})
def save(self, *args, **kwargs):
vrf_id = self.vrf.pk if self.vrf else None
if not self.pk and not self.parent:
parent = self.find_parent_prefix(self)
self.parent = parent
elif self.parent and (self.prefix != self._prefix or vrf_id != self._vrf_id):
parent = self.find_parent_prefix(self)
self.parent = parent
if isinstance(self.prefix, netaddr.IPNetwork):
@@ -392,11 +346,11 @@ class Prefix(ContactsMixin, GetAvailablePrefixesMixin, CachedScopeMixin, Primary
return netaddr.IPAddress(self.prefix).format(netaddr.ipv6_full)
@property
def depth_count(self):
def depth(self):
return self._depth
@property
def children_count(self):
def children(self):
return self._children
def _set_prefix_length(self, value):
@@ -536,52 +490,11 @@ class Prefix(ContactsMixin, GetAvailablePrefixesMixin, CachedScopeMixin, Primary
return min(utilization, 100)
@classmethod
def find_parent_prefix(cls, network):
prefixes = Prefix.objects.filter(
models.Q(
vrf=network.vrf,
prefix__net_contains=str(network.prefix)
) | models.Q(
vrf=None,
status=PrefixStatusChoices.STATUS_CONTAINER,
prefix__net_contains=str(network.prefix),
)
)
return prefixes.last()
class Role(OrganizationalModel):
"""
A Role represents the functional role of a Prefix or VLAN; for example, "Customer," "Infrastructure," or
"Management."
"""
weight = models.PositiveSmallIntegerField(
verbose_name=_('weight'),
default=1000
)
class Meta:
ordering = ('weight', 'name')
verbose_name = _('role')
verbose_name_plural = _('roles')
def __str__(self):
return self.name
class IPRange(ContactsMixin, PrimaryModel):
"""
A range of IP addresses, defined by start and end addresses.
"""
prefix = models.ForeignKey(
to='ipam.Prefix',
on_delete=models.SET_NULL,
related_name='ip_ranges',
null=True,
blank=True,
verbose_name=_('prefix'),
)
start_address = IPAddressField(
verbose_name=_('start address'),
help_text=_('IPv4 or IPv6 address (with mask)')
@@ -651,27 +564,6 @@ class IPRange(ContactsMixin, PrimaryModel):
super().clean()
if self.start_address and self.end_address:
# If prefix is set, validate suitability
if self.prefix:
# Check that start address and end address are within the prefix range
if self.start_address not in self.prefix.prefix and self.end_address not in self.prefix.prefix:
raise ValidationError({
'start_address': _("Start address must be part of the selected prefix"),
'end_address': _("End address must be part of the selected prefix.")
})
elif self.start_address not in self.prefix.prefix:
raise ValidationError({
'start_address': _("Start address must be part of the selected prefix")
})
elif self.end_address not in self.prefix.prefix:
raise ValidationError({
'end_address': _("End address must be part of the selected prefix.")
})
# Check that VRF matches prefix VRF
if self.vrf != self.prefix.vrf:
raise ValidationError({
'vrf': _("VRF must match the prefix VRF.")
})
# Check that start & end IP versions match
if self.start_address.version != self.end_address.version:
@@ -828,14 +720,6 @@ class IPRange(ContactsMixin, PrimaryModel):
return min(float(child_count) / self.size * 100, 100)
@classmethod
def find_prefix(self, address):
prefixes = Prefix.objects.filter(
models.Q(prefix__net_contains=address.start_address) & Q(prefix__net_contains=address.end_address),
vrf=address.vrf,
)
return prefixes.last()
class IPAddress(ContactsMixin, PrimaryModel):
"""
@@ -848,14 +732,6 @@ class IPAddress(ContactsMixin, PrimaryModel):
for example, when mapping public addresses to private addresses. When an Interface has been assigned an IPAddress
which has a NAT outside IP, that Interface's Device can use either the inside or outside IP as its primary IP.
"""
prefix = models.ForeignKey(
to='ipam.Prefix',
on_delete=models.SET_NULL,
related_name='ip_addresses',
blank=True,
null=True,
verbose_name=_('Prefix')
)
address = IPAddressField(
verbose_name=_('address'),
help_text=_('IPv4 or IPv6 address (with mask)')
@@ -943,7 +819,6 @@ class IPAddress(ContactsMixin, PrimaryModel):
def __init__(self, *args, **kwargs):
super().__init__(*args, **kwargs)
self._address = self.address
# Denote the original assigned object (if any) for validation in clean()
self._original_assigned_object_id = self.__dict__.get('assigned_object_id')
self._original_assigned_object_type_id = self.__dict__.get('assigned_object_type_id')
@@ -990,16 +865,6 @@ class IPAddress(ContactsMixin, PrimaryModel):
super().clean()
if self.address:
# If prefix is set, validate suitability
if self.prefix:
if self.address not in self.prefix.prefix:
raise ValidationError({
'prefix': _("IP address must be part of the selected prefix.")
})
if self.vrf != self.prefix.vrf:
raise ValidationError({
'vrf': _("IP address VRF must match the prefix VRF.")
})
# /0 masks are not acceptable
if self.address.prefixlen == 0:
@@ -1042,13 +907,13 @@ class IPAddress(ContactsMixin, PrimaryModel):
})
# Disallow the creation of IPAddresses within an IPRange with mark_populated=True
parent_range = IPRange.objects.filter(
parent_range_qs = IPRange.objects.filter(
start_address__lte=self.address,
end_address__gte=self.address,
vrf=self.vrf,
mark_populated=True
).first()
if parent_range:
)
if not self.pk and (parent_range := parent_range_qs.first()):
raise ValidationError({
'address': _(
"Cannot create IP address {ip} inside range {range}."
@@ -1140,8 +1005,3 @@ class IPAddress(ContactsMixin, PrimaryModel):
def get_role_color(self):
return IPAddressRoleChoices.colors.get(self.role)
@classmethod
def find_prefix(self, address):
prefixes = Prefix.objects.filter(prefix__net_contains=address.address, vrf=address.vrf)
return prefixes.last()

View File

@@ -53,12 +53,11 @@ class IPAddressIndex(SearchIndex):
model = models.IPAddress
fields = (
('address', 100),
('prefix', 200),
('dns_name', 300),
('description', 500),
('comments', 5000),
)
display_attrs = ('prefix', 'vrf', 'tenant', 'status', 'role', 'description')
display_attrs = ('vrf', 'tenant', 'status', 'role', 'description')
@register_search
@@ -67,11 +66,10 @@ class IPRangeIndex(SearchIndex):
fields = (
('start_address', 100),
('end_address', 300),
('prefix', 400),
('description', 500),
('comments', 5000),
)
display_attrs = ('prefix', 'vrf', 'tenant', 'status', 'role', 'description')
display_attrs = ('vrf', 'tenant', 'status', 'role', 'description')
@register_search
@@ -79,12 +77,10 @@ class PrefixIndex(SearchIndex):
model = models.Prefix
fields = (
('prefix', 110),
('parent', 200),
('aggregate', 300),
('description', 500),
('comments', 5000),
)
display_attrs = ('scope', 'aggregate', 'parent', 'vrf', 'tenant', 'vlan', 'status', 'role', 'description')
display_attrs = ('scope', 'vrf', 'tenant', 'vlan', 'status', 'role', 'description')
@register_search

View File

@@ -20,6 +20,16 @@ class ASNRangeTable(TenancyColumnsMixin, OrganizationalModelTable):
verbose_name=_('RIR'),
linkify=True
)
start_asdot = tables.Column(
accessor=tables.A('start_asdot'),
order_by=tables.A('start'),
verbose_name=_('Start (ASDOT)')
)
end_asdot = tables.Column(
accessor=tables.A('end_asdot'),
order_by=tables.A('end'),
verbose_name=_('End (ASDOT)')
)
tags = columns.TagColumn(
url_name='ipam:asnrange_list'
)
@@ -30,8 +40,8 @@ class ASNRangeTable(TenancyColumnsMixin, OrganizationalModelTable):
class Meta(OrganizationalModelTable.Meta):
model = ASNRange
fields = (
'pk', 'name', 'slug', 'rir', 'start', 'end', 'asn_count', 'tenant', 'tenant_group', 'description',
'comments', 'tags', 'created', 'last_updated', 'actions',
'pk', 'name', 'slug', 'rir', 'start', 'start_asdot', 'end', 'end_asdot', 'asn_count', 'tenant',
'tenant_group', 'description', 'comments', 'tags', 'created', 'last_updated', 'actions',
)
default_columns = ('pk', 'name', 'rir', 'start', 'end', 'tenant', 'asn_count', 'description')

View File

@@ -152,10 +152,6 @@ class PrefixUtilizationColumn(columns.UtilizationColumn):
class PrefixTable(TenancyColumnsMixin, ContactsColumnMixin, PrimaryModelTable):
parent = tables.Column(
verbose_name=_('Parent'),
linkify=True
)
prefix = columns.TemplateColumn(
verbose_name=_('Prefix'),
template_code=PREFIX_LINK_WITH_DEPTH,
@@ -234,9 +230,9 @@ class PrefixTable(TenancyColumnsMixin, ContactsColumnMixin, PrimaryModelTable):
class Meta(PrimaryModelTable.Meta):
model = Prefix
fields = (
'pk', 'id', 'prefix', 'status', 'parent', 'prefix', 'prefix_flat', 'children', 'vrf', 'utilization',
'tenant', 'tenant_group', 'scope', 'scope_type', 'vlan_group', 'vlan', 'role', 'is_pool', 'mark_utilized',
'contacts', 'description', 'comments', 'tags', 'created', 'last_updated',
'pk', 'id', 'prefix', 'prefix_flat', 'status', 'children', 'vrf', 'utilization', 'tenant', 'tenant_group',
'scope', 'scope_type', 'vlan_group', 'vlan', 'role', 'is_pool', 'mark_utilized', 'description', 'contacts',
'comments', 'tags', 'created', 'last_updated',
)
default_columns = (
'pk', 'prefix', 'status', 'children', 'vrf', 'utilization', 'tenant', 'scope', 'vlan', 'role',
@@ -250,11 +246,8 @@ class PrefixTable(TenancyColumnsMixin, ContactsColumnMixin, PrimaryModelTable):
#
# IP ranges
#
class IPRangeTable(TenancyColumnsMixin, ContactsColumnMixin, PrimaryModelTable):
prefix = tables.Column(
verbose_name=_('Prefix'),
linkify=True
)
start_address = tables.Column(
verbose_name=_('Start address'),
linkify=True
@@ -291,9 +284,9 @@ class IPRangeTable(TenancyColumnsMixin, ContactsColumnMixin, PrimaryModelTable):
class Meta(PrimaryModelTable.Meta):
model = IPRange
fields = (
'pk', 'id', 'start_address', 'end_address', 'prefix', 'size', 'vrf', 'status', 'role', 'tenant',
'tenant_group', 'mark_populated', 'mark_utilized', 'utilization', 'description', 'contacts',
'comments', 'tags', 'created', 'last_updated',
'pk', 'id', 'start_address', 'end_address', 'size', 'vrf', 'status', 'role', 'tenant', 'tenant_group',
'mark_populated', 'mark_utilized', 'utilization', 'description', 'contacts', 'comments', 'tags',
'created', 'last_updated',
)
default_columns = (
'pk', 'start_address', 'end_address', 'size', 'vrf', 'status', 'role', 'tenant', 'description',
@@ -308,18 +301,10 @@ class IPRangeTable(TenancyColumnsMixin, ContactsColumnMixin, PrimaryModelTable):
#
class IPAddressTable(TenancyColumnsMixin, ContactsColumnMixin, PrimaryModelTable):
prefix = tables.Column(
verbose_name=_('Prefix'),
linkify=True
)
address = tables.TemplateColumn(
template_code=IPADDRESS_LINK,
verbose_name=_('IP Address')
)
prefix = tables.Column(
linkify=True,
verbose_name=_('Prefix')
)
vrf = tables.TemplateColumn(
template_code=VRF_LINK,
verbose_name=_('VRF')
@@ -368,9 +353,8 @@ class IPAddressTable(TenancyColumnsMixin, ContactsColumnMixin, PrimaryModelTable
class Meta(PrimaryModelTable.Meta):
model = IPAddress
fields = (
'pk', 'id', 'address', 'vrf', 'prefix', 'status', 'role', 'tenant', 'tenant_group', 'nat_inside',
'nat_outside', 'assigned', 'dns_name', 'description', 'comments', 'contacts', 'tags', 'created',
'last_updated',
'pk', 'id', 'address', 'vrf', 'status', 'role', 'tenant', 'tenant_group', 'nat_inside', 'nat_outside',
'assigned', 'dns_name', 'description', 'comments', 'contacts', 'tags', 'created', 'last_updated',
)
default_columns = (
'pk', 'address', 'vrf', 'status', 'role', 'tenant', 'assigned', 'dns_name', 'description',

View File

@@ -16,20 +16,12 @@ PREFIX_COPY_BUTTON = """
PREFIX_LINK_WITH_DEPTH = """
{% load helpers %}
{% if record.depth_count %}
{% if object %}
<div class="record-depth">
{% for i in record.depth_count|parent_depth:object|as_range %}
<span>•</span>
{% endfor %}
</div>
{% else %}
<div class="record-depth">
{% for i in record.depth_count|as_range %}
<span>•</span>
{% endfor %}
</div>
{% endif %}
{% if record.depth %}
<div class="record-depth">
{% for i in record.depth|as_range %}
<span>•</span>
{% endfor %}
</div>
{% endif %}
""" + PREFIX_LINK

View File

@@ -407,8 +407,7 @@ class RoleTest(APIViewTestCases.APIViewTestCase):
class PrefixTest(APIViewTestCases.APIViewTestCase):
model = Prefix
# TODO: Alter for parent prefix
brief_fields = ['_depth', 'aggregate', 'description', 'display', 'family', 'id', 'parent', 'prefix', 'url']
brief_fields = ['_depth', 'description', 'display', 'family', 'id', 'prefix', 'url']
create_data = [
{
'prefix': '192.168.4.0/24',
@@ -623,8 +622,7 @@ class PrefixTest(APIViewTestCases.APIViewTestCase):
class IPRangeTest(APIViewTestCases.APIViewTestCase):
model = IPRange
# TODO: Alter for parent prefix
brief_fields = ['description', 'display', 'end_address', 'family', 'id', 'prefix', 'start_address', 'url']
brief_fields = ['description', 'display', 'end_address', 'family', 'id', 'start_address', 'url']
create_data = [
{
'start_address': '192.168.4.10/24',
@@ -782,8 +780,7 @@ class IPRangeTest(APIViewTestCases.APIViewTestCase):
class IPAddressTest(APIViewTestCases.APIViewTestCase):
model = IPAddress
# TODO: Alter for parent prefix
brief_fields = ['address', 'description', 'display', 'family', 'id', 'prefix', 'url']
brief_fields = ['address', 'description', 'display', 'family', 'id', 'url']
create_data = [
{
'address': '192.168.0.4/24',
@@ -1074,14 +1071,17 @@ class VLANGroupTest(APIViewTestCases.APIViewTestCase):
{
'name': 'VLAN Group 4',
'slug': 'vlan-group-4',
'vid_ranges': [[1, 4094]]
},
{
'name': 'VLAN Group 5',
'slug': 'vlan-group-5',
'vid_ranges': [[1, 4094]]
},
{
'name': 'VLAN Group 6',
'slug': 'vlan-group-6',
'vid_ranges': [[1, 4094]]
},
]
bulk_update_data = {

View File

@@ -901,10 +901,6 @@ class PrefixTestCase(TestCase, ChangeLoggedFilterSetTests):
params = {'description': ['foobar1', 'foobar2']}
self.assertEqual(self.filterset(params, self.queryset).qs.count(), 2)
# TODO: Test for parent prefix
# TODO: Test for children?
# TODO: Test for aggregate
class IPRangeTestCase(TestCase, ChangeLoggedFilterSetTests):
queryset = IPRange.objects.all()
@@ -1083,7 +1079,6 @@ class IPRangeTestCase(TestCase, ChangeLoggedFilterSetTests):
self.assertEqual(self.filterset(params, self.queryset).qs.count(), 2)
def test_parent(self):
# TODO: Alter for prefix
params = {'parent': ['10.0.1.0/24', '10.0.2.0/24']}
self.assertEqual(self.filterset(params, self.queryset).qs.count(), 2)
params = {'parent': ['10.0.1.0/25']} # Range 10.0.1.100-199 is not fully contained by 10.0.1.0/25
@@ -1323,7 +1318,6 @@ class IPAddressTestCase(TestCase, ChangeLoggedFilterSetTests):
self.assertEqual(self.filterset(params, self.queryset).qs.count(), 2)
def test_parent(self):
# TODO: Alter for prefix
params = {'parent': ['10.0.0.0/30', '2001:db8::/126']}
self.assertEqual(self.filterset(params, self.queryset).qs.count(), 8)

View File

@@ -39,26 +39,6 @@ class TestAggregate(TestCase):
class TestIPRange(TestCase):
@classmethod
def setUpTestData(cls):
cls.vrf = VRF.objects.create(name='VRF A', rd='1:1')
cls.prefixes = (
# IPv4
Prefix(prefix='192.0.0.0/16'),
Prefix(prefix='192.0.2.0/24'),
Prefix(prefix='192.0.0.0/16', vrf=cls.vrf),
# IPv6
Prefix(prefix='2001:db8::/32'),
Prefix(prefix='2001:db8::/64'),
)
for prefix in cls.prefixes:
prefix.clean()
prefix.save()
def test_overlapping_range(self):
iprange_192_168 = IPRange.objects.create(
@@ -107,69 +87,6 @@ class TestIPRange(TestCase):
)
iprange_4_198_201.clean()
def test_parent_prefix(self):
ranges = (
IPRange(
start_address=IPNetwork('192.0.0.1/24'),
end_address=IPNetwork('192.0.0.254/24'),
prefix=self.prefixes[0]
),
IPRange(
start_address=IPNetwork('192.0.2.1/24'),
end_address=IPNetwork('192.0.2.254/24'),
prefix=self.prefixes[1]
),
IPRange(
start_address=IPNetwork('192.0.2.1/24'),
end_address=IPNetwork('192.0.2.254/24'),
vrf=self.vrf,
prefix=self.prefixes[2]
),
IPRange(
start_address=IPNetwork('2001:db8::/64'),
end_address=IPNetwork('2001:db8::ffff/64'),
prefix=self.prefixes[4]
),
IPRange(
start_address=IPNetwork('2001:db8:2::/64'),
end_address=IPNetwork('2001:db8:2::ffff/64'),
prefix=self.prefixes[3]
),
)
for range in ranges:
range.clean()
range.save()
self.assertEqual(ranges[0].prefix, self.prefixes[0])
self.assertEqual(ranges[1].prefix, self.prefixes[1])
self.assertEqual(ranges[2].prefix, self.prefixes[2])
self.assertEqual(ranges[3].prefix, self.prefixes[4])
def test_parent_prefix_change(self):
range = IPRange(
start_address=IPNetwork('192.0.1.1/24'),
end_address=IPNetwork('192.0.1.254/24'),
prefix=self.prefixes[0]
)
range.clean()
range.save()
prefix = Prefix(prefix='192.0.0.0/17')
prefix.clean()
prefix.save()
range.refresh_from_db()
self.assertEqual(range.prefix, prefix)
# TODO: Prefix Altered
# TODO: Prefix Deleted
# TODO: Prefix falls outside range
# TODO: Prefix VRF does not match range VRF
class TestPrefix(TestCase):
@@ -252,21 +169,19 @@ class TestPrefix(TestCase):
prefix=IPNetwork('10.0.0.0/16'), status=PrefixStatusChoices.STATUS_CONTAINER
)
ips = IPAddress.objects.bulk_create((
IPAddress(prefix=parent_prefix, address=IPNetwork('10.0.0.1/24'), vrf=None),
IPAddress(prefix=parent_prefix, address=IPNetwork('10.0.1.1/24'), vrf=vrfs[0]),
IPAddress(prefix=parent_prefix, address=IPNetwork('10.0.2.1/24'), vrf=vrfs[1]),
IPAddress(prefix=parent_prefix, address=IPNetwork('10.0.3.1/24'), vrf=vrfs[2]),
IPAddress(address=IPNetwork('10.0.0.1/24'), vrf=None),
IPAddress(address=IPNetwork('10.0.1.1/24'), vrf=vrfs[0]),
IPAddress(address=IPNetwork('10.0.2.1/24'), vrf=vrfs[1]),
IPAddress(address=IPNetwork('10.0.3.1/24'), vrf=vrfs[2]),
))
child_ip_pks = {p.pk for p in parent_prefix.ip_addresses.all()}
child_ip_pks = {p.pk for p in parent_prefix.get_child_ips()}
# Global container should return all children
self.assertSetEqual(child_ip_pks, {ips[0].pk, ips[1].pk, ips[2].pk, ips[3].pk})
parent_prefix.vrf = vrfs[0]
parent_prefix.save()
parent_prefix.refresh_from_db()
child_ip_pks = {p.pk for p in parent_prefix.ip_addresses.all()}
child_ip_pks = {p.pk for p in parent_prefix.get_child_ips()}
# VRF container is limited to its own VRF
self.assertSetEqual(child_ip_pks, {ips[1].pk})
@@ -429,21 +344,17 @@ class TestPrefixHierarchy(TestCase):
prefixes = (
# IPv4
Prefix(prefix='10.0.0.0/8'),
Prefix(prefix='10.0.0.0/16'),
Prefix(prefix='10.0.0.0/24'),
Prefix(prefix='192.168.0.0/16'),
Prefix(prefix='10.0.0.0/8', _depth=0, _children=2),
Prefix(prefix='10.0.0.0/16', _depth=1, _children=1),
Prefix(prefix='10.0.0.0/24', _depth=2, _children=0),
# IPv6
Prefix(prefix='2001:db8::/32'),
Prefix(prefix='2001:db8::/40'),
Prefix(prefix='2001:db8::/48'),
Prefix(prefix='2001:db8::/32', _depth=0, _children=2),
Prefix(prefix='2001:db8::/40', _depth=1, _children=1),
Prefix(prefix='2001:db8::/48', _depth=2, _children=0),
)
for prefix in prefixes:
prefix.clean()
prefix.save()
Prefix.objects.bulk_create(prefixes)
def test_create_prefix4(self):
# Create 10.0.0.0/12
@@ -451,19 +362,15 @@ class TestPrefixHierarchy(TestCase):
prefixes = Prefix.objects.filter(prefix__family=4)
self.assertEqual(prefixes[0].prefix, IPNetwork('10.0.0.0/8'))
self.assertEqual(prefixes[0].parent, None)
self.assertEqual(prefixes[0]._depth, 0)
self.assertEqual(prefixes[0]._children, 3)
self.assertEqual(prefixes[1].prefix, IPNetwork('10.0.0.0/12'))
self.assertEqual(prefixes[1].parent.prefix, IPNetwork('10.0.0.0/8'))
self.assertEqual(prefixes[1]._depth, 1)
self.assertEqual(prefixes[1]._children, 2)
self.assertEqual(prefixes[2].prefix, IPNetwork('10.0.0.0/16'))
self.assertEqual(prefixes[2].parent.prefix, IPNetwork('10.0.0.0/12'))
self.assertEqual(prefixes[2]._depth, 2)
self.assertEqual(prefixes[2]._children, 1)
self.assertEqual(prefixes[3].prefix, IPNetwork('10.0.0.0/24'))
self.assertEqual(prefixes[3].parent.prefix, IPNetwork('10.0.0.0/16'))
self.assertEqual(prefixes[3]._depth, 3)
self.assertEqual(prefixes[3]._children, 0)
@@ -473,19 +380,15 @@ class TestPrefixHierarchy(TestCase):
prefixes = Prefix.objects.filter(prefix__family=6)
self.assertEqual(prefixes[0].prefix, IPNetwork('2001:db8::/32'))
self.assertEqual(prefixes[0].parent, None)
self.assertEqual(prefixes[0]._depth, 0)
self.assertEqual(prefixes[0]._children, 3)
self.assertEqual(prefixes[1].prefix, IPNetwork('2001:db8::/36'))
self.assertEqual(prefixes[1].parent.prefix, IPNetwork('2001:db8::/32'))
self.assertEqual(prefixes[1]._depth, 1)
self.assertEqual(prefixes[1]._children, 2)
self.assertEqual(prefixes[2].prefix, IPNetwork('2001:db8::/40'))
self.assertEqual(prefixes[2].parent.prefix, IPNetwork('2001:db8::/36'))
self.assertEqual(prefixes[2]._depth, 2)
self.assertEqual(prefixes[2]._children, 1)
self.assertEqual(prefixes[3].prefix, IPNetwork('2001:db8::/48'))
self.assertEqual(prefixes[3].parent.prefix, IPNetwork('2001:db8::/40'))
self.assertEqual(prefixes[3]._depth, 3)
self.assertEqual(prefixes[3]._children, 0)
@@ -497,15 +400,12 @@ class TestPrefixHierarchy(TestCase):
prefixes = Prefix.objects.filter(prefix__family=4)
self.assertEqual(prefixes[0].prefix, IPNetwork('10.0.0.0/8'))
self.assertEqual(prefixes[0].parent, None)
self.assertEqual(prefixes[0]._depth, 0)
self.assertEqual(prefixes[0]._children, 2)
self.assertEqual(prefixes[1].prefix, IPNetwork('10.0.0.0/12'))
self.assertEqual(prefixes[1].parent.prefix, IPNetwork('10.0.0.0/8'))
self.assertEqual(prefixes[1]._depth, 1)
self.assertEqual(prefixes[1]._children, 1)
self.assertEqual(prefixes[2].prefix, IPNetwork('10.0.0.0/16'))
self.assertEqual(prefixes[2].parent.prefix, IPNetwork('10.0.0.0/12'))
self.assertEqual(prefixes[2]._depth, 2)
self.assertEqual(prefixes[2]._children, 0)
@@ -517,15 +417,12 @@ class TestPrefixHierarchy(TestCase):
prefixes = Prefix.objects.filter(prefix__family=6)
self.assertEqual(prefixes[0].prefix, IPNetwork('2001:db8::/32'))
self.assertEqual(prefixes[0].parent, None)
self.assertEqual(prefixes[0]._depth, 0)
self.assertEqual(prefixes[0]._children, 2)
self.assertEqual(prefixes[1].prefix, IPNetwork('2001:db8::/36'))
self.assertEqual(prefixes[1].parent.prefix, IPNetwork('2001:db8::/32'))
self.assertEqual(prefixes[1]._depth, 1)
self.assertEqual(prefixes[1]._children, 1)
self.assertEqual(prefixes[2].prefix, IPNetwork('2001:db8::/40'))
self.assertEqual(prefixes[2].parent.prefix, IPNetwork('2001:db8::/36'))
self.assertEqual(prefixes[2]._depth, 2)
self.assertEqual(prefixes[2]._children, 0)
@@ -540,17 +437,14 @@ class TestPrefixHierarchy(TestCase):
prefixes = Prefix.objects.filter(vrf__isnull=True, prefix__family=4)
self.assertEqual(prefixes[0].prefix, IPNetwork('10.0.0.0/8'))
self.assertEqual(prefixes[0].parent, None)
self.assertEqual(prefixes[0]._depth, 0)
self.assertEqual(prefixes[0]._children, 1)
self.assertEqual(prefixes[1].prefix, IPNetwork('10.0.0.0/24'))
self.assertEqual(prefixes[1].parent.prefix, IPNetwork('10.0.0.0/8'))
self.assertEqual(prefixes[1]._depth, 1)
self.assertEqual(prefixes[1]._children, 0)
prefixes = Prefix.objects.filter(vrf=vrf)
self.assertEqual(prefixes[0].prefix, IPNetwork('10.0.0.0/16'))
self.assertEqual(prefixes[0].parent, None)
self.assertEqual(prefixes[0]._depth, 0)
self.assertEqual(prefixes[0]._children, 0)
@@ -565,17 +459,14 @@ class TestPrefixHierarchy(TestCase):
prefixes = Prefix.objects.filter(vrf__isnull=True, prefix__family=6)
self.assertEqual(prefixes[0].prefix, IPNetwork('2001:db8::/32'))
self.assertEqual(prefixes[0].parent, None)
self.assertEqual(prefixes[0]._depth, 0)
self.assertEqual(prefixes[0]._children, 1)
self.assertEqual(prefixes[1].prefix, IPNetwork('2001:db8::/48'))
self.assertEqual(prefixes[1].parent.prefix, IPNetwork('2001:db8::/32'))
self.assertEqual(prefixes[1]._depth, 1)
self.assertEqual(prefixes[1]._children, 0)
prefixes = Prefix.objects.filter(vrf=vrf)
self.assertEqual(prefixes[0].prefix, IPNetwork('2001:db8::/40'))
self.assertEqual(prefixes[0].parent, None)
self.assertEqual(prefixes[0]._depth, 0)
self.assertEqual(prefixes[0]._children, 0)
@@ -585,11 +476,9 @@ class TestPrefixHierarchy(TestCase):
prefixes = Prefix.objects.filter(prefix__family=4)
self.assertEqual(prefixes[0].prefix, IPNetwork('10.0.0.0/8'))
self.assertEqual(prefixes[0].parent, None)
self.assertEqual(prefixes[0]._depth, 0)
self.assertEqual(prefixes[0]._children, 1)
self.assertEqual(prefixes[1].prefix, IPNetwork('10.0.0.0/24'))
self.assertEqual(prefixes[1].parent.prefix, IPNetwork('10.0.0.0/8'))
self.assertEqual(prefixes[1]._depth, 1)
self.assertEqual(prefixes[1]._children, 0)
@@ -599,11 +488,9 @@ class TestPrefixHierarchy(TestCase):
prefixes = Prefix.objects.filter(prefix__family=6)
self.assertEqual(prefixes[0].prefix, IPNetwork('2001:db8::/32'))
self.assertEqual(prefixes[0].parent, None)
self.assertEqual(prefixes[0]._depth, 0)
self.assertEqual(prefixes[0]._children, 1)
self.assertEqual(prefixes[1].prefix, IPNetwork('2001:db8::/48'))
self.assertEqual(prefixes[1].parent.prefix, IPNetwork('2001:db8::/32'))
self.assertEqual(prefixes[1]._depth, 1)
self.assertEqual(prefixes[1]._children, 0)
@@ -613,20 +500,15 @@ class TestPrefixHierarchy(TestCase):
prefixes = Prefix.objects.filter(prefix__family=4)
self.assertEqual(prefixes[0].prefix, IPNetwork('10.0.0.0/8'))
self.assertEqual(prefixes[0].parent, None)
self.assertEqual(prefixes[0]._depth, 0)
self.assertEqual(prefixes[0]._children, 3)
self.assertEqual(prefixes[1].prefix, IPNetwork('10.0.0.0/16'))
self.assertEqual(prefixes[1].parent.prefix, IPNetwork('10.0.0.0/8'))
self.assertEqual(prefixes[1]._depth, 1)
self.assertEqual(prefixes[1]._children, 1)
self.assertEqual(prefixes[2].prefix, IPNetwork('10.0.0.0/16'))
self.assertEqual(prefixes[2].parent.prefix, IPNetwork('10.0.0.0/8'))
self.assertEqual(prefixes[2]._depth, 1)
self.assertEqual(prefixes[2]._children, 1)
self.assertEqual(prefixes[3].prefix, IPNetwork('10.0.0.0/24'))
# TODO: How do we resolve the parent for duplicate prefixes?
self.assertEqual(prefixes[3].parent.prefix, IPNetwork('10.0.0.0/16'))
self.assertEqual(prefixes[3]._depth, 2)
self.assertEqual(prefixes[3]._children, 0)
@@ -636,158 +518,20 @@ class TestPrefixHierarchy(TestCase):
prefixes = Prefix.objects.filter(prefix__family=6)
self.assertEqual(prefixes[0].prefix, IPNetwork('2001:db8::/32'))
self.assertEqual(prefixes[0].parent, None)
self.assertEqual(prefixes[0]._depth, 0)
self.assertEqual(prefixes[0]._children, 3)
self.assertEqual(prefixes[1].prefix, IPNetwork('2001:db8::/40'))
self.assertEqual(prefixes[1].parent.prefix, IPNetwork('2001:db8::/32'))
self.assertEqual(prefixes[1]._depth, 1)
self.assertEqual(prefixes[1]._children, 1)
self.assertEqual(prefixes[2].prefix, IPNetwork('2001:db8::/40'))
self.assertEqual(prefixes[2].parent.prefix, IPNetwork('2001:db8::/32'))
self.assertEqual(prefixes[2]._depth, 1)
self.assertEqual(prefixes[2]._children, 1)
self.assertEqual(prefixes[3].prefix, IPNetwork('2001:db8::/48'))
self.assertEqual(prefixes[3].parent.prefix, IPNetwork('2001:db8::/40'))
self.assertEqual(prefixes[3]._depth, 2)
self.assertEqual(prefixes[3]._children, 0)
class TestTriggers(TestCase):
"""
Test the automatic adjustment of prefix parent assignments in response to changes made within
the prefix hierarchy.
"""
@classmethod
def setUpTestData(cls):
vrfs = (
VRF(name='VRF A'),
VRF(name='VRF B'),
)
for vrf in vrfs:
vrf.clean()
vrf.save()
cls.prefixes = (
# IPv4
Prefix(prefix='10.0.0.0/8'),
Prefix(prefix='10.0.0.0/16'),
Prefix(prefix='10.0.0.0/22'),
Prefix(prefix='10.0.0.0/23'),
Prefix(prefix='10.0.2.0/23'),
Prefix(prefix='10.0.0.0/24'),
Prefix(prefix='10.0.1.0/24'),
Prefix(prefix='10.0.2.0/24'),
Prefix(prefix='10.0.3.0/24'),
Prefix(prefix='10.1.0.0/16', status='container'),
Prefix(prefix='10.1.0.0/22', vrf=vrfs[0]),
Prefix(prefix='10.1.0.0/23', vrf=vrfs[0]),
Prefix(prefix='10.1.2.0/23', vrf=vrfs[0]),
Prefix(prefix='10.1.0.0/24', vrf=vrfs[0]),
Prefix(prefix='10.1.1.0/24', vrf=vrfs[0]),
Prefix(prefix='10.1.2.0/24', vrf=vrfs[0]),
Prefix(prefix='10.1.3.0/24', vrf=vrfs[0]),
)
for prefix in cls.prefixes:
prefix.clean()
prefix.save()
def test_current_hierarchy(self):
self.assertIsNone(Prefix.objects.get(prefix='10.0.0.0/8').parent)
self.assertEqual(Prefix.objects.get(prefix='10.0.0.0/16').parent, Prefix.objects.get(prefix='10.0.0.0/8'))
self.assertEqual(Prefix.objects.get(prefix='10.0.0.0/22').parent, Prefix.objects.get(prefix='10.0.0.0/16'))
self.assertEqual(Prefix.objects.get(prefix='10.0.0.0/23').parent, Prefix.objects.get(prefix='10.0.0.0/22'))
self.assertEqual(Prefix.objects.get(prefix='10.0.2.0/23').parent, Prefix.objects.get(prefix='10.0.0.0/22'))
self.assertEqual(Prefix.objects.get(prefix='10.0.0.0/24').parent, Prefix.objects.get(prefix='10.0.0.0/23'))
self.assertEqual(Prefix.objects.get(prefix='10.0.1.0/24').parent, Prefix.objects.get(prefix='10.0.0.0/23'))
self.assertEqual(Prefix.objects.get(prefix='10.0.2.0/24').parent, Prefix.objects.get(prefix='10.0.2.0/23'))
self.assertEqual(Prefix.objects.get(prefix='10.0.3.0/24').parent, Prefix.objects.get(prefix='10.0.2.0/23'))
def test_basic_insert(self):
pfx = Prefix.objects.create(prefix='10.0.0.0/21')
self.assertIsNotNone(Prefix.objects.get(prefix='10.0.0.0/22').parent)
self.assertEqual(Prefix.objects.get(prefix='10.0.0.0/22').parent, pfx)
def test_vrf_insert(self):
vrf = VRF.objects.get(name='VRF A')
pfx = Prefix.objects.create(prefix='10.1.0.0/21', vrf=vrf)
parent = Prefix.objects.get(prefix='10.1.0.0/16')
self.assertIsNotNone(Prefix.objects.get(prefix='10.1.0.0/21').parent)
self.assertEqual(Prefix.objects.get(prefix='10.1.0.0/21').parent, parent)
self.assertIsNotNone(Prefix.objects.get(prefix='10.1.0.0/22').parent)
self.assertEqual(Prefix.objects.get(prefix='10.1.0.0/22').parent, pfx)
def test_basic_delete(self):
Prefix.objects.get(prefix='10.0.0.0/23').delete()
parent = Prefix.objects.get(prefix='10.0.0.0/22')
self.assertEqual(Prefix.objects.get(prefix='10.0.0.0/24').parent, parent)
self.assertEqual(Prefix.objects.get(prefix='10.0.1.0/24').parent, parent)
self.assertEqual(Prefix.objects.get(prefix='10.0.2.0/24').parent, Prefix.objects.get(prefix='10.0.2.0/23'))
def test_vrf_delete(self):
Prefix.objects.get(prefix='10.1.0.0/23').delete()
parent = Prefix.objects.get(prefix='10.1.0.0/22')
self.assertEqual(Prefix.objects.get(prefix='10.1.0.0/24').parent, parent)
self.assertEqual(Prefix.objects.get(prefix='10.1.1.0/24').parent, parent)
self.assertEqual(Prefix.objects.get(prefix='10.1.2.0/24').parent, Prefix.objects.get(prefix='10.1.2.0/23'))
def test_basic_update(self):
pfx = Prefix.objects.get(prefix='10.0.0.0/23')
parent = Prefix.objects.get(prefix='10.0.0.0/22')
pfx.prefix = '10.3.0.0/23'
pfx.parent = Prefix.objects.get(prefix='10.0.0.0/8')
pfx.clean()
pfx.save()
self.assertEqual(Prefix.objects.get(prefix='10.0.0.0/24').parent, parent)
self.assertEqual(Prefix.objects.get(prefix='10.0.1.0/24').parent, parent)
self.assertEqual(Prefix.objects.get(prefix='10.0.2.0/24').parent, Prefix.objects.get(prefix='10.0.2.0/23'))
def test_vrf_update(self):
pfx = Prefix.objects.get(prefix='10.1.0.0/23')
parent = Prefix.objects.get(prefix='10.1.0.0/22')
pfx.prefix = '10.3.0.0/23'
pfx.parent = None
pfx.clean()
pfx.save()
self.assertEqual(Prefix.objects.get(prefix='10.1.0.0/24').parent, parent)
self.assertEqual(Prefix.objects.get(prefix='10.1.1.0/24').parent, parent)
self.assertEqual(Prefix.objects.get(prefix='10.1.2.0/24').parent, Prefix.objects.get(prefix='10.1.2.0/23'))
# TODO: Test VRF Changes
class TestIPAddress(TestCase):
"""
Test parent prefix assignment, duplicate detection, and uniqueness enforcement for IP addresses.
"""
@classmethod
def setUpTestData(cls):
cls.vrf = VRF.objects.create(name='VRF A', rd='1:1')
cls.prefixes = (
# IPv4
Prefix(prefix='192.0.0.0/16'),
Prefix(prefix='192.0.2.0/24'),
Prefix(prefix='192.0.0.0/16', vrf=cls.vrf),
# IPv6
Prefix(prefix='2001:db8::/32'),
Prefix(prefix='2001:db8::/64'),
)
for prefix in cls.prefixes:
prefix.clean()
prefix.save()
def test_get_duplicates(self):
ips = IPAddress.objects.bulk_create((
@@ -799,44 +543,6 @@ class TestIPAddress(TestCase):
self.assertSetEqual(set(duplicate_ip_pks), {ips[1].pk, ips[2].pk})
def test_parent_prefix(self):
ips = (
IPAddress(address=IPNetwork('192.0.0.1/24'), prefix=self.prefixes[0]),
IPAddress(address=IPNetwork('192.0.2.1/24'), prefix=self.prefixes[1]),
IPAddress(address=IPNetwork('192.0.2.1/24'), vrf=self.vrf, prefix=self.prefixes[2]),
IPAddress(address=IPNetwork('2001:db8::/64'), prefix=self.prefixes[4]),
IPAddress(address=IPNetwork('2001:db8:2::/64'), prefix=self.prefixes[3]),
)
for ip in ips:
ip.clean()
ip.save()
self.assertEqual(ips[0].prefix, self.prefixes[0])
self.assertEqual(ips[1].prefix, self.prefixes[1])
self.assertEqual(ips[2].prefix, self.prefixes[2])
self.assertEqual(ips[3].prefix, self.prefixes[4])
self.assertEqual(ips[4].prefix, self.prefixes[3])
def test_parent_prefix_change(self):
ip = IPAddress(address=IPNetwork('192.0.1.1/24'), prefix=self.prefixes[0])
ip.clean()
ip.save()
prefix = Prefix(prefix='192.0.1.0/17')
prefix.clean()
prefix.save()
ip.refresh_from_db()
self.assertEqual(ip.prefix, prefix)
# TODO: Prefix Altered
# TODO: Prefix Deleted
# TODO: Prefix does not contain IP Address
# TODO: Prefix VRF does not match IP Address VRF
#
# Uniqueness enforcement tests
#
@@ -853,20 +559,13 @@ class TestIPAddress(TestCase):
self.assertRaises(ValidationError, duplicate_ip.clean)
def test_duplicate_vrf(self):
vrf = VRF.objects.get(rd='1:1')
vrf.enforce_unique = False
vrf.clean()
vrf.save()
vrf = VRF.objects.create(name='Test', rd='1:1', enforce_unique=False)
IPAddress.objects.create(vrf=vrf, address=IPNetwork('192.0.2.1/24'))
duplicate_ip = IPAddress(vrf=vrf, address=IPNetwork('192.0.2.1/24'))
self.assertIsNone(duplicate_ip.clean())
def test_duplicate_vrf_unique(self):
vrf = VRF.objects.get(rd='1:1')
vrf.enforce_unique = True
vrf.clean()
vrf.save()
vrf = VRF.objects.create(name='Test', rd='1:1', enforce_unique=True)
IPAddress.objects.create(vrf=vrf, address=IPNetwork('192.0.2.1/24'))
duplicate_ip = IPAddress(vrf=vrf, address=IPNetwork('192.0.2.1/24'))
self.assertRaises(ValidationError, duplicate_ip.clean)

View File

@@ -421,7 +421,6 @@ class PrefixTestCase(ViewTestCases.PrimaryObjectViewTestCase):
tags = create_tags('Alpha', 'Bravo', 'Charlie')
# TODO: Alter for prefix
cls.form_data = {
'prefix': IPNetwork('192.0.2.0/24'),
'scope_type': ContentType.objects.get_for_model(Site).pk,
@@ -437,7 +436,6 @@ class PrefixTestCase(ViewTestCases.PrimaryObjectViewTestCase):
}
site = sites[0].pk
# TODO: Alter for prefix
cls.csv_data = (
"vrf,prefix,status,scope_type,scope_id",
f"VRF 1,10.4.0.0/16,active,dcim.site,{site}",
@@ -445,7 +443,6 @@ class PrefixTestCase(ViewTestCases.PrimaryObjectViewTestCase):
f"VRF 1,10.6.0.0/16,active,dcim.site,{site}",
)
# TODO: Alter for prefix
cls.csv_update_data = (
"id,description,status",
f"{prefixes[0].pk},New description 7,{PrefixStatusChoices.STATUS_RESERVED}",
@@ -453,7 +450,6 @@ class PrefixTestCase(ViewTestCases.PrimaryObjectViewTestCase):
f"{prefixes[2].pk},New description 9,{PrefixStatusChoices.STATUS_RESERVED}",
)
# TODO: Alter for prefix
cls.bulk_edit_data = {
'vrf': vrfs[1].pk,
'tenant': None,
@@ -481,9 +477,9 @@ class PrefixTestCase(ViewTestCases.PrimaryObjectViewTestCase):
def test_prefix_ipranges(self):
prefix = Prefix.objects.create(prefix=IPNetwork('192.168.0.0/16'))
ip_ranges = (
IPRange(prefix=prefix, start_address='192.168.0.1/24', end_address='192.168.0.100/24', size=99),
IPRange(prefix=prefix, start_address='192.168.1.1/24', end_address='192.168.1.100/24', size=99),
IPRange(prefix=prefix, start_address='192.168.2.1/24', end_address='192.168.2.100/24', size=99),
IPRange(start_address='192.168.0.1/24', end_address='192.168.0.100/24', size=99),
IPRange(start_address='192.168.1.1/24', end_address='192.168.1.100/24', size=99),
IPRange(start_address='192.168.2.1/24', end_address='192.168.2.100/24', size=99),
)
IPRange.objects.bulk_create(ip_ranges)
self.assertEqual(prefix.get_child_ranges().count(), 3)
@@ -495,12 +491,12 @@ class PrefixTestCase(ViewTestCases.PrimaryObjectViewTestCase):
def test_prefix_ipaddresses(self):
prefix = Prefix.objects.create(prefix=IPNetwork('192.168.0.0/16'))
ip_addresses = (
IPAddress(prefix=prefix, address=IPNetwork('192.168.0.1/16')),
IPAddress(prefix=prefix, address=IPNetwork('192.168.0.2/16')),
IPAddress(prefix=prefix, address=IPNetwork('192.168.0.3/16')),
IPAddress(address=IPNetwork('192.168.0.1/16')),
IPAddress(address=IPNetwork('192.168.0.2/16')),
IPAddress(address=IPNetwork('192.168.0.3/16')),
)
IPAddress.objects.bulk_create(ip_addresses)
self.assertEqual(prefix.ip_addresses.all().count(), 3)
self.assertEqual(prefix.get_child_ips().count(), 3)
url = reverse('ipam:prefix_ipaddresses', kwargs={'pk': prefix.pk})
self.assertHttpStatus(self.client.get(url), 200)
@@ -568,6 +564,82 @@ vlan: 102
self.assertEqual(prefix.vlan.vid, 102)
self.assertEqual(prefix.scope, site)
@override_settings(EXEMPT_VIEW_PERMISSIONS=['*'])
def test_prefix_import_with_vlan_site_multiple_vlans_same_vid(self):
"""
Test import when multiple VLANs exist with the same vid but different sites.
Ref: #20560
"""
site1 = Site.objects.get(name='Site 1')
site2 = Site.objects.get(name='Site 2')
# Create VLANs with the same vid but different sites
vlan1 = VLAN.objects.create(vid=1, name='VLAN1-Site1', site=site1)
VLAN.objects.create(vid=1, name='VLAN1-Site2', site=site2) # Create ambiguity
# Import prefix with vlan_site specified
IMPORT_DATA = f"""
prefix: 10.11.0.0/22
status: active
scope_type: dcim.site
scope_id: {site1.pk}
vlan_site: {site1.name}
vlan: 1
description: LOC02-MGMT
"""
# Add all required permissions to the test user
self.add_permissions('ipam.view_prefix', 'ipam.add_prefix')
form_data = {
'data': IMPORT_DATA,
'format': 'yaml'
}
response = self.client.post(reverse('ipam:prefix_bulk_import'), data=form_data, follow=True)
self.assertHttpStatus(response, 200)
# Verify the prefix was created with the correct VLAN
prefix = Prefix.objects.get(prefix='10.11.0.0/22')
self.assertEqual(prefix.vlan, vlan1)
@override_settings(EXEMPT_VIEW_PERMISSIONS=['*'])
def test_prefix_import_with_vlan_site_and_global_vlan(self):
"""
Test import when a global VLAN (no site) and site-specific VLAN exist with same vid.
When vlan_site is specified, should prefer the site-specific VLAN.
Ref: #20560
"""
site1 = Site.objects.get(name='Site 1')
# Create a global VLAN (no site) and a site-specific VLAN with the same vid
VLAN.objects.create(vid=10, name='VLAN10-Global', site=None) # Create ambiguity
vlan_site = VLAN.objects.create(vid=10, name='VLAN10-Site1', site=site1)
# Import prefix with vlan_site specified
IMPORT_DATA = f"""
prefix: 10.12.0.0/22
status: active
scope_type: dcim.site
scope_id: {site1.pk}
vlan_site: {site1.name}
vlan: 10
description: Test Site-Specific VLAN
"""
# Add all required permissions to the test user
self.add_permissions('ipam.view_prefix', 'ipam.add_prefix')
form_data = {
'data': IMPORT_DATA,
'format': 'yaml'
}
response = self.client.post(reverse('ipam:prefix_bulk_import'), data=form_data, follow=True)
self.assertHttpStatus(response, 200)
# Verify the prefix was created with the site-specific VLAN (not the global one)
prefix = Prefix.objects.get(prefix='10.12.0.0/22')
self.assertEqual(prefix.vlan, vlan_site)
class IPRangeTestCase(ViewTestCases.PrimaryObjectViewTestCase):
model = IPRange
@@ -598,7 +670,6 @@ class IPRangeTestCase(ViewTestCases.PrimaryObjectViewTestCase):
tags = create_tags('Alpha', 'Bravo', 'Charlie')
# TODO: Alter for prefix
cls.form_data = {
'start_address': IPNetwork('192.0.5.10/24'),
'end_address': IPNetwork('192.0.5.100/24'),
@@ -612,7 +683,6 @@ class IPRangeTestCase(ViewTestCases.PrimaryObjectViewTestCase):
'tags': [t.pk for t in tags],
}
# TODO: Alter for prefix
cls.csv_data = (
"vrf,start_address,end_address,status",
"VRF 1,10.1.0.1/16,10.1.9.254/16,active",
@@ -620,7 +690,6 @@ class IPRangeTestCase(ViewTestCases.PrimaryObjectViewTestCase):
"VRF 1,10.3.0.1/16,10.3.9.254/16,active",
)
# TODO: Alter for prefix
cls.csv_update_data = (
"id,description,status",
f"{ip_ranges[0].pk},New description 7,{IPRangeStatusChoices.STATUS_RESERVED}",
@@ -628,7 +697,6 @@ class IPRangeTestCase(ViewTestCases.PrimaryObjectViewTestCase):
f"{ip_ranges[2].pk},New description 9,{IPRangeStatusChoices.STATUS_RESERVED}",
)
# TODO: Alter for prefix
cls.bulk_edit_data = {
'vrf': vrfs[1].pk,
'tenant': None,
@@ -695,7 +763,6 @@ class IPAddressTestCase(ViewTestCases.PrimaryObjectViewTestCase):
),
)
FHRPGroup.objects.bulk_create(fhrp_groups)
# TODO: Alter for prefix
cls.form_data = {
'vrf': vrfs[1].pk,
'address': IPNetwork('192.0.2.99/24'),
@@ -708,7 +775,6 @@ class IPAddressTestCase(ViewTestCases.PrimaryObjectViewTestCase):
'tags': [t.pk for t in tags],
}
# TODO: Alter for prefix
cls.csv_data = (
"vrf,address,status,fhrp_group",
"VRF 1,192.0.2.4/24,active,FHRP Group 1",
@@ -716,7 +782,6 @@ class IPAddressTestCase(ViewTestCases.PrimaryObjectViewTestCase):
"VRF 1,192.0.2.6/24,active,FHRP Group 3",
)
# TODO: Alter for prefix
cls.csv_update_data = (
"id,description,status",
f"{ipaddresses[0].pk},New description 7,{IPAddressStatusChoices.STATUS_RESERVED}",
@@ -724,7 +789,6 @@ class IPAddressTestCase(ViewTestCases.PrimaryObjectViewTestCase):
f"{ipaddresses[2].pk},New description 9,{IPAddressStatusChoices.STATUS_RESERVED}",
)
# TODO: Alter for prefix
cls.bulk_edit_data = {
'vrf': vrfs[1].pk,
'tenant': None,

View File

@@ -1,91 +0,0 @@
ipam_prefix_delete_adjust_prefix_parent = """
-- Update child prefixes with the deleted prefix's parent. This is a safe assumption, since that parent is the
-- next direct parent for anything this prefix contained.
UPDATE ipam_prefix SET parent_id=OLD.parent_id WHERE parent_id=OLD.id;
RETURN OLD;
"""
ipam_prefix_insert_adjust_prefix_parent = """
-- Update the prefix with the new parent if the parent is the most appropriate prefix
UPDATE ipam_prefix
SET parent_id=NEW.id
WHERE
prefix << NEW.prefix
AND
(
(vrf_id = NEW.vrf_id OR (vrf_id IS NULL AND NEW.vrf_id IS NULL))
OR
(
NEW.vrf_id IS NULL
AND
NEW.status = 'container'
AND
NOT EXISTS(
SELECT 1 FROM ipam_prefix p WHERE p.prefix >> ipam_prefix.prefix AND p.vrf_id = ipam_prefix.vrf_id
)
)
)
AND id != NEW.id
AND NOT EXISTS (
SELECT 1 FROM ipam_prefix p
WHERE
p.prefix >> ipam_prefix.prefix
AND p.prefix << NEW.prefix
AND (
(p.vrf_id = ipam_prefix.vrf_id OR (p.vrf_id IS NULL AND ipam_prefix.vrf_id IS NULL))
OR
(p.vrf_id IS NULL AND p.status = 'container')
)
AND p.id != NEW.id
)
;
RETURN NEW;
"""
ipam_prefix_update_adjust_prefix_parent = """
-- When a prefix changes, reassign any child prefixes that no longer
-- fall within the new prefix range to the old parent (or set NULL if no parent exists)
UPDATE ipam_prefix
SET parent_id = OLD.parent_id
WHERE
parent_id = NEW.id
-- Child prefix no longer contained within the updated prefix
AND NOT (prefix << NEW.prefix);
-- Update the prefix with the new parent if the parent is the most appropriate prefix
UPDATE ipam_prefix
SET parent_id=NEW.id
WHERE
prefix << NEW.prefix
AND
(
(vrf_id = NEW.vrf_id OR (vrf_id IS NULL AND NEW.vrf_id IS NULL))
OR
(
NEW.vrf_id IS NULL
AND
NEW.status = 'container'
AND
NOT EXISTS(
SELECT 1 FROM ipam_prefix p WHERE p.prefix >> ipam_prefix.prefix AND p.vrf_id = ipam_prefix.vrf_id
)
)
)
AND id != NEW.id
AND NOT EXISTS (
SELECT 1 FROM ipam_prefix p
WHERE
p.prefix >> ipam_prefix.prefix
AND p.prefix << NEW.prefix
AND (
(p.vrf_id = ipam_prefix.vrf_id OR (p.vrf_id IS NULL AND ipam_prefix.vrf_id IS NULL))
OR
(p.vrf_id IS NULL AND p.status = 'container')
)
AND p.id != NEW.id
)
;
RETURN NEW;
"""

View File

@@ -687,13 +687,13 @@ class PrefixIPAddressesView(generic.ObjectChildrenView):
template_name = 'ipam/prefix/ip_addresses.html'
tab = ViewTab(
label=_('IP Addresses'),
badge=lambda x: x.ip_addresses.count(),
badge=lambda x: x.get_child_ips().count(),
permission='ipam.view_ipaddress',
weight=700
)
def get_children(self, request, parent):
return parent.ip_addresses.restrict(request.user, 'view').prefetch_related('vrf', 'tenant', 'tenant__group')
return parent.get_child_ips().restrict(request.user, 'view').prefetch_related('vrf', 'tenant', 'tenant__group')
def prep_table_data(self, request, queryset, parent):
if not request.GET.get('q') and not get_table_ordering(request, self.table):

View File

@@ -80,22 +80,21 @@ class Config:
try:
# Enforce the creation date as the ordering parameter
revision = ConfigRevision.objects.get(active=True)
logger.debug(f"Loaded active configuration revision #{revision.pk}")
logger.debug(f"Loaded active configuration revision (#{revision.pk})")
except (ConfigRevision.DoesNotExist, ConfigRevision.MultipleObjectsReturned):
logger.debug("No active configuration revision found - falling back to most recent")
revision = ConfigRevision.objects.order_by('-created').first()
if revision is None:
logger.debug("No previous configuration found in database; proceeding with default values")
logger.debug("No configuration found in database; proceeding with default values")
return
logger.debug(f"Using fallback configuration revision #{revision.pk}")
logger.debug(f"No active configuration revision found; falling back to most recent (#{revision.pk})")
except DatabaseError:
# The database may not be available yet (e.g. when running a management command)
logger.warning("Skipping config initialization (database unavailable)")
return
revision.activate()
logger.debug("Filled cache with data from latest ConfigRevision")
revision.activate(update_db=False)
self._populate_from_cache()
logger.debug("Filled cache with data from latest ConfigRevision")
class ConfigItem:

View File

@@ -42,7 +42,9 @@ class NetBoxModelFilterSetForm(FilterModifierMixin, CustomFieldsMixin, SavedFilt
)
def _get_form_field(self, customfield):
return customfield.to_form_field(set_initial=False, enforce_required=False, enforce_visibility=False)
return customfield.to_form_field(
set_initial=False, enforce_required=False, enforce_visibility=False, for_filterset_form=True
)
class OwnerFilterMixin(forms.Form):

View File

@@ -288,12 +288,13 @@ class CustomFieldsMixin(models.Model):
cf.name: cf for cf in CustomField.objects.get_for_model(self)
}
# Remove any stale custom field data
self.custom_field_data = {
k: v for k, v in self.custom_field_data.items() if k in custom_fields.keys()
}
# Validate all field values
for field_name, value in self.custom_field_data.items():
if field_name not in custom_fields:
raise ValidationError(_("Unknown field name '{name}' in custom field data.").format(
name=field_name
))
try:
custom_fields[field_name].validate(value)
except ValidationError as e:

View File

@@ -1,3 +1,4 @@
from django.db.models import ForeignKey
from django.template import loader
from django.urls.exceptions import NoReverseMatch
from django.utils.translation import gettext_lazy as _
@@ -175,6 +176,21 @@ class BulkEdit(ObjectAction):
permissions_required = {'change'}
template_name = 'buttons/bulk_edit.html'
@classmethod
def get_context(cls, context, model):
url_params = super().get_url_params(context)
# If this is a child object, pass the parent's PK as a URL parameter
if parent := context.get('object'):
for field in model._meta.get_fields():
if isinstance(field, ForeignKey) and field.remote_field.model == parent.__class__:
url_params[field.name] = parent.pk
break
return {
'url_params': url_params,
}
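
To make the parent lookup above concrete, here is a small framework-free sketch of the same idea: scan a child model's fields for a ForeignKey targeting the parent's class and seed the URL parameters with the parent's PK. The stub classes and names below are illustrative assumptions, not NetBox models; the real code iterates model._meta.get_fields().

# Illustrative stand-ins only; real code inspects Django's ForeignKey fields.
class ForeignKeyStub:
    def __init__(self, name, remote_model):
        self.name = name
        self.remote_model = remote_model

def parent_url_params(parent, child_fields):
    """Return {fk_field_name: parent.pk} for the first FK that targets the parent's class."""
    for f in child_fields:
        if isinstance(f, ForeignKeyStub) and f.remote_model is type(parent):
            return {f.name: parent.pk}
    return {}

class DeviceTypeStub:
    pk = 42

print(parent_url_params(DeviceTypeStub(), [ForeignKeyStub('device_type', DeviceTypeStub)]))
# -> {'device_type': 42}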
class BulkRename(ObjectAction):
"""

View File

@@ -454,7 +454,6 @@ INSTALLED_APPS = [
'sorl.thumbnail',
'taggit',
'timezone_field',
'pgtrigger',
'core',
'account',
'circuits',
@@ -834,6 +833,7 @@ LANGUAGES = (
('fr', _('French')),
('it', _('Italian')),
('ja', _('Japanese')),
('lv', _('Latvian')),
('nl', _('Dutch')),
('pl', _('Polish')),
('pt', _('Portuguese')),

View File

@@ -1,11 +1,10 @@
from unittest import skipIf
from django.conf import settings
from django.test import TestCase
from django.test import RequestFactory, TestCase
from dcim.models import Device
from netbox.object_actions import AddObject, BulkImport
from netbox.tests.dummy_plugin.models import DummyNetBoxModel
from dcim.models import Device, DeviceType, Manufacturer
from netbox.object_actions import AddObject, BulkEdit, BulkImport
class ObjectActionTest(TestCase):
@@ -20,9 +19,11 @@ class ObjectActionTest(TestCase):
url = BulkImport.get_url(obj)
self.assertEqual(url, '/dcim/devices/import/')
@skipIf('netbox.tests.dummy_plugin' not in settings.PLUGINS, "dummy_plugin not in settings.PLUGINS")
@skipIf('netbox.tests.dummy_plugin' not in settings.PLUGINS, 'dummy_plugin not in settings.PLUGINS')
def test_get_url_plugin_model(self):
"""Test URL generation for plugin models includes plugins: namespace"""
from netbox.tests.dummy_plugin.models import DummyNetBoxModel
obj = DummyNetBoxModel()
url = AddObject.get_url(obj)
@@ -30,3 +31,29 @@ class ObjectActionTest(TestCase):
url = BulkImport.get_url(obj)
self.assertEqual(url, '/plugins/dummy-plugin/netboxmodel/import/')
def test_bulk_edit_get_context_child_object(self):
"""
Test that the parent object's PK is included in the context for child objects.
Ensure that BulkEdit.get_context() correctly identifies and
includes the parent object's PK when rendering a child object's
action button.
"""
manufacturer = Manufacturer.objects.create(name='Manufacturer 1', slug='manufacturer-1')
device_type = DeviceType.objects.create(manufacturer=manufacturer, model='Device Type 1', slug='device-type-1')
# Mock context containing the parent object (DeviceType)
request = RequestFactory().get('/')
context = {
'request': request,
'object': device_type,
}
# Get context for the child model (Device)
action_context = BulkEdit.get_context(context, Device)
# Verify that 'device_type' (the FK field name) is present in
# url_params with the parent's PK
self.assertIn('url_params', action_context)
self.assertEqual(action_context['url_params'].get('device_type'), device_type.pk)

View File

@@ -52,6 +52,7 @@ def handler_500(request, template_name=ERROR_500_TEMPLATE_NAME):
type_, error = sys.exc_info()[:2]
return HttpResponseServerError(template.render({
'request': request,
'error': error,
'exception': str(type_),
'netbox_version': settings.RELEASE.full_version,

View File

@@ -1,5 +1,6 @@
import logging
import re
from collections import Counter
from copy import deepcopy
from django.contrib import messages
@@ -33,6 +34,7 @@ from utilities.jobs import is_background_request, process_request_as_job
from utilities.permissions import get_permission_for_model
from utilities.query import reapply_model_ordering
from utilities.request import safe_for_redirect
from utilities.string import title
from utilities.tables import get_table_configs
from utilities.views import GetReturnURLMixin, get_action_url
from .base import BaseMultiObjectView
@@ -443,6 +445,18 @@ class BulkImportView(GetReturnURLMixin, BaseMultiObjectView):
# Prefetch objects to be updated, if any
prefetch_ids = [int(record['id']) for record in records if record.get('id')]
# check for duplicate IDs
duplicate_pks = [pk for pk, count in Counter(prefetch_ids).items() if count > 1]
if duplicate_pks:
error_msg = _(
"Duplicate objects found: {model} with ID(s) {ids} appears multiple times"
).format(
model=title(self.queryset.model._meta.verbose_name),
ids=', '.join(str(pk) for pk in sorted(duplicate_pks))
)
raise ValidationError(error_msg)
prefetched_objects = {
obj.pk: obj
for obj in self.queryset.model.objects.filter(id__in=prefetch_ids)
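
As a quick, self-contained illustration of the duplicate-ID check introduced above (the sample records are made up):

from collections import Counter

records = [{'id': '3'}, {'id': '5'}, {'id': '3'}, {'name': 'no-id'}]
prefetch_ids = [int(r['id']) for r in records if r.get('id')]
duplicate_pks = [pk for pk, count in Counter(prefetch_ids).items() if count > 1]
print(duplicate_pks)  # -> [3]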

File diff suppressed because one or more lines are too long

File diff suppressed because one or more lines are too long

File diff suppressed because one or more lines are too long

View File

@@ -28,10 +28,10 @@
"bootstrap": "5.3.8",
"clipboard": "2.0.11",
"flatpickr": "4.6.13",
"gridstack": "12.4.1",
"gridstack": "12.4.2",
"htmx.org": "2.0.8",
"query-string": "9.3.1",
"sass": "1.97.0",
"sass": "1.97.2",
"tom-select": "2.4.3",
"typeface-inter": "3.18.1",
"typeface-roboto-mono": "1.1.13"

View File

@@ -28,13 +28,27 @@ function updateElements(targetMode: ColorMode): void {
}
for (const elevation of getElements<HTMLObjectElement>('.rack_elevation')) {
const svg = elevation.contentDocument?.querySelector('svg') ?? null;
if (svg !== null) {
const svg = elevation.firstElementChild ?? null;
if (svg !== null && svg.nodeName == 'svg') {
svg.setAttribute(`data-bs-theme`, targetMode);
}
}
}
/**
* Update the color mode of rack elevations after an htmx swap.
* Pulls the current color mode from localStorage.
*
* @param evt htmx listener event details. See: https://htmx.org/events/#htmx:afterSwap
*/
function updateElevations(evt: CustomEvent): void {
const swappedElement = evt.detail.elt;
if (swappedElement.nodeName == 'svg') {
const currentMode = localStorage.getItem(COLOR_MODE_KEY);
swappedElement.setAttribute('data-bs-theme', currentMode);
}
}
/**
* Call all functions necessary to update the color mode across the UI.
*
@@ -115,6 +129,7 @@ function initColorModeToggle(): void {
*/
export function initColorMode(): void {
window.addEventListener('load', defaultColorMode);
window.addEventListener('htmx:afterSwap', updateElevations as EventListener); // Uses a custom event from HTMX
for (const func of [initColorModeToggle]) {
func();
}

View File

@@ -36,7 +36,6 @@ form.object-edit {
// Make optgroup labels sticky when scrolling through select elements
select[multiple] {
optgroup {
position: sticky;
top: 0;
background-color: var(--bs-body-bg);
font-style: normal;

View File

@@ -2304,10 +2304,10 @@ graphql@16.12.0:
resolved "https://registry.npmjs.org/graphql/-/graphql-16.12.0.tgz"
integrity sha512-DKKrynuQRne0PNpEbzuEdHlYOMksHSUI8Zc9Unei5gTsMNA2/vMpoMz/yKba50pejK56qj98qM0SjYxAKi13gQ==
gridstack@12.4.1:
version "12.4.1"
resolved "https://registry.yarnpkg.com/gridstack/-/gridstack-12.4.1.tgz#4a44511e5da33016e731f00bee279bed550d4ab9"
integrity sha512-dYBNVEDw2zwnz0bCDouHk8rMclrMoMn4r6rtNyyWSeYsV3RF8QV2KFRTj4c86T2FsZPr3iQv+/LD/ae29FcpHQ==
gridstack@12.4.2:
version "12.4.2"
resolved "https://registry.yarnpkg.com/gridstack/-/gridstack-12.4.2.tgz#188de180b6cda77e48b1414aac1d778a38f48f04"
integrity sha512-aXbJrQpi3LwpYXYOr4UriPM5uc/dPcjK01SdOE5PDpx2vi8tnLhU7yBg/1i4T59UhNkG/RBfabdFUObuN+gMnw==
has-bigints@^1.0.1, has-bigints@^1.0.2:
version "1.0.2"
@@ -3251,10 +3251,10 @@ safe-regex-test@^1.1.0:
es-errors "^1.3.0"
is-regex "^1.2.1"
sass@1.97.0:
version "1.97.0"
resolved "https://registry.yarnpkg.com/sass/-/sass-1.97.0.tgz#8ed65df5e2f73012d5ef0e98837ff63550657ab2"
integrity sha512-KR0igP1z4avUJetEuIeOdDlwaUDvkH8wSx7FdSjyYBS3dpyX3TzHfAMO0G1Q4/3cdjcmi3r7idh+KCmKqS+KeQ==
sass@1.97.2:
version "1.97.2"
resolved "https://registry.yarnpkg.com/sass/-/sass-1.97.2.tgz#e515a319092fd2c3b015228e3094b40198bff0da"
integrity sha512-y5LWb0IlbO4e97Zr7c3mlpabcbBtS+ieiZ9iwDooShpFKWXf62zz5pEPdwrLYm+Bxn1fnbwFGzHuCLSA9tBmrw==
dependencies:
chokidar "^4.0.0"
immutable "^5.0.2"

View File

@@ -1,4 +1,3 @@
version: "4.5.0"
edition: "Community"
published: "2025-12-16"
designation: "beta1"
published: "2026-01-06"

View File

@@ -35,6 +35,12 @@
{% trans "Plugins" %}: {% for plugin, version in plugins.items %}
{{ plugin }}: {{ version }}{% empty %}{% trans "None installed" %}{% endfor %}
</pre>
<p>
{% trans "The request which yielded the above error is shown below:" %}
</p>
<p>
<code>{{ request.method }} {{ request.build_absolute_uri }}</code>
</p>
<p>
{% trans "If further assistance is required, please post to the" %} <a href="https://github.com/netbox-community/netbox/discussions">{% trans "NetBox discussion forum" %}</a> {% trans "on GitHub" %}.
</p>

View File

@@ -112,6 +112,19 @@
<th scope="row">{% trans "Bridge" %}</th>
<td>{{ object.bridge|linkify|placeholder }}</td>
</tr>
<tr>
<th scope="row">{% trans "Bridged Interfaces" %}</th>
<td>
{% if bridge_interfaces %}
{% for interface in bridge_interfaces %}
{{ interface|linkify }}
{% if not forloop.last %}<br />{% endif %}
{% endfor %}
{% else %}
{{ ''|placeholder }}
{% endif %}
</td>
</tr>
<tr>
<th scope="row">{% trans "LAG" %}</th>
<td>{{ object.lag|linkify|placeholder }}</td>
@@ -435,13 +448,11 @@
</div>
</div>
{% endif %}
{% if object.is_bridge %}
<div class="row mb-3">
<div class="col col-md-12">
{% include 'inc/panel_table.html' with table=bridge_interfaces_table heading="Bridge Interfaces" %}
</div>
<div class="row mb-3">
<div class="col col-md-12">
{% include 'inc/panel_table.html' with table=bridge_interfaces_table heading="Bridged Interfaces" %}
</div>
{% endif %}
</div>
<div class="row mb-3">
<div class="col col-md-12">
{% include 'inc/panel_table.html' with table=child_interfaces_table heading="Child Interfaces" %}

View File

@@ -23,7 +23,7 @@
</tr>
<tr>
<th scope="row">{% trans "Range" %}</th>
<td>{{ object.range_as_string }}</td>
<td>{{ object.range_as_string_with_asdot }}</td>
</tr>
<tr>
<th scope="row">{% trans "Tenant" %}</th>

View File

@@ -14,10 +14,6 @@
<th scope="row">{% trans "Family" %}</th>
<td>IPv{{ object.family }}</td>
</tr>
<tr>
<th scope="row">{% trans "Prefix" %}</th>
<td>{{ object.prefix|linkify|placeholder }}</td>
</tr>
<tr>
<th scope="row">{% trans "VRF" %}</th>
<td>

View File

@@ -14,7 +14,6 @@
<div class="row">
<h2 class="col-9 offset-3">{% trans "IP Addresses" %}</h2>
</div>
{% render_field model_form.prefix %}
{% render_field form.pattern %}
{% render_field model_form.status %}
{% render_field model_form.role %}

View File

@@ -13,10 +13,6 @@
<th scope="row">{% trans "Family" %}</th>
<td>IPv{{ object.family }}</td>
</tr>
<tr>
<th scope="row">{% trans "Prefix" %}</th>
<td>{{ object.prefix|linkify|placeholder }}</td>
</tr>
<tr>
<th scope="row">{% trans "Starting Address" %}</th>
<td>{{ object.start_address }}</td>

View File

@@ -109,7 +109,7 @@
{% endif %}
</td>
</tr>
{% with child_ip_count=object.ip_addresses.count %}
{% with child_ip_count=object.get_child_ips.count %}
<tr>
<th scope="row">{% trans "Child IPs" %}</th>
<td>

File diff suppressed because it is too large

File diff suppressed because it is too large

File diff suppressed because it is too large

File diff suppressed because it is too large

File diff suppressed because it is too large

File diff suppressed because it is too large

File diff suppressed because it is too large

File diff suppressed because it is too large

Some files were not shown because too many files have changed in this diff.