Mirror of https://github.com/netbox-community/netbox.git (synced 2026-02-10 10:57:43 +01:00)

Compare commits: v4.4.0-bet ... v4.4.1 (100 commits)
Commits (SHA1):

8fa1abd371, 81401b9e17, 5bfbca9a83, 85689b25de, c2aa87a4c9, 34b111bdc4, 684106031a, 31644b4ce6, fb004bb94e, 192440a4d3,
2dac09cea0, 2a99aadc5d, 2d6b3d19e7, 103939ad3c, 4b17faae52, 37644eed3f, cf0ef92268, 77376524f9, 53d1b1aa50, d172e6210b,
cd122a7dde, d1e40281f3, be4db9a899, 01f1228e3b, c57d9f9a37, 6f01da90b4, bf7356473c, a99e21afd6, 0e627d4d9b, 1034f738af,
873372f61e, 1d9d7f2d84, 83fe973fea, 8ebc677372, 9d0e80571c, 291010737a, b24f8fb340, a611ade5d3, 099f3b2f34, 1b83d32f4a,
af6f4ce3ab, d2c0026b9d, 1eeede0931, c3b37db8f7, c9dc2005b0, c9f823167c, 5ca2cea016, 026737b62b, 94faf58c27, de499ca686,
f04a2b965f, fcb380b5c5, 8311f457b5, 2ba2864a6a, 47e4947ca0, 545773e221, f9159ad9bd, 2ddec1ef48, 309e434064, 8a1db81111,
399d51b466, 6135fb8cd7, 0a336465f2, ea50786b5c, d8822c8bca, 319556a747, d433456e2f, 8f8ca805c4, 133918321a, 6e6c02f98c,
44dae99205, 57bb7c0a8e, 29ea88eb94, 2d339033e2, 08ae139161, 1c1073e160, 0870ec6eb8, 81579b6739, b334931513, 704f0507e7,
122e2d13dd, 0c3beec3a2, 758be46a6f, 5ac3e79e7b, 7033230388, 66140fc017, d5e49c8cb0, 6b3b4b3193, 2e809904fa, 8b397f3b42,
7bbb04d2d3, f2b29273d0, 92fba0bed4, 53c890c081, db1786c385, a59da37ac3, 9580ac2946, a9ada4457b, 9f605a2db1, 44f173f01d
@@ -15,7 +15,7 @@ body:
    attributes:
      label: NetBox version
      description: What version of NetBox are you currently running?
      placeholder: v4.3.6
      placeholder: v4.4.1
    validations:
      required: true
  - type: dropdown
.github/ISSUE_TEMPLATE/02-bug_report.yaml (vendored, 18 lines changed)
@@ -8,26 +8,26 @@ body:
    attributes:
      value: >
        **NOTE:** This form is only for reporting _reproducible bugs_ in a current NetBox
        installation. If you're having trouble with installation or just looking for
        assistance with using NetBox, please visit our
        release. If you're having trouble with installation or just looking for assistance
        using NetBox, please visit our
        [discussion forum](https://github.com/netbox-community/netbox/discussions) instead.
  - type: dropdown
    attributes:
      label: Deployment Type
      label: NetBox Edition
      description: >
        How are you running NetBox? (For issues with the Docker image, please go to the
        [netbox-docker](https://github.com/netbox-community/netbox-docker) repo.)
        Users of [NetBox Cloud](https://netboxlabs.com/netbox-cloud/) or
        [NetBox Enterprise](https://netboxlabs.com/netbox-enterprise/), please contact the
        [NetBox Labs](https://netboxlabs.com/) support team for assistance to ensure your
        request receives immediate attention.
      options:
        - NetBox Cloud
        - NetBox Enterprise
        - Self-hosted
        - NetBox Community
    validations:
      required: true
  - type: input
    attributes:
      label: NetBox Version
      description: What version of NetBox are you currently running?
      placeholder: v4.3.6
      placeholder: v4.4.1
    validations:
      required: true
  - type: dropdown
.github/ISSUE_TEMPLATE/config.yml (vendored, 3 lines changed)
@@ -13,9 +13,6 @@ contact_links:
  - name: 🌎 Correct a Translation
    url: https://explore.transifex.com/netbox-community/netbox/
    about: "Spot an incorrect translation? You can propose a fix on Transifex."
  - name: 💡 Plugin Idea
    url: https://plugin-ideas.netbox.dev
    about: "Have an idea for a plugin? Head over to the ideas board!"
  - name: 💬 Community Slack
    url: https://netdev.chat
    about: "Join #netbox on the NetDev Community Slack for assistance with installation issues and other problems."
@@ -21,6 +21,14 @@ repos:
        language: system
        pass_filenames: false
        types: [python]
      - id: openapi-check
        name: "Validate OpenAPI schema"
        description: "Check for any unexpected changes to the OpenAPI schema"
        files: api/.*\.py$
        entry: scripts/verify-openapi.sh
        language: system
        pass_filenames: false
        types: [python]
      - id: mkdocs-build
        name: "Build documentation"
        description: "Build the documentation with mkdocs"
@@ -91,7 +91,6 @@ NetBox automatically logs the creation, modification, and deletion of all manage
* Join the conversation on [the discussion forum](https://github.com/netbox-community/netbox/discussions) and [Slack](https://netdev.chat/)!
* Already a power user? You can [suggest a feature](https://github.com/netbox-community/netbox/issues/new?assignees=&labels=type%3A+feature&template=feature_request.yaml) or [report a bug](https://github.com/netbox-community/netbox/issues/new?assignees=&labels=type%3A+bug&template=bug_report.yaml) on GitHub.
* Contributions from the community are encouraged and appreciated! Check out our [contributing guide](CONTRIBUTING.md) to get started.
* [Share your idea](https://plugin-ideas.netbox.dev/) for a new plugin, or [learn how to build one](https://github.com/netbox-community/netbox-plugin-tutorial) yourself!

## Screenshots
@@ -34,4 +34,4 @@ For any security concerns regarding the community-maintained Docker image for Ne

### Bug Bounties

As NetBox is provided as free open source software, we do not offer any monetary compensation for vulnerability or bug reports, however your contributions are greatly appreciated.
As NetBox is provided as free open source software, we do not offer any monetary compensation for vulnerability or bug reports; however, your contributions are greatly appreciated.
@@ -106,7 +106,11 @@ mkdocs-material

# Introspection for embedded code
# https://github.com/mkdocstrings/mkdocstrings/blob/main/CHANGELOG.md
mkdocstrings[python]
mkdocstrings

# Python handler for mkdocstrings
# https://github.com/mkdocstrings/python/blob/main/CHANGELOG.md
mkdocstrings-python

# Library for manipulating IP prefixes and addresses
# https://github.com/netaddr/netaddr/blob/master/CHANGELOG.rst
@@ -330,14 +330,87 @@
    "100base-lfx",
    "100base-tx",
    "100base-t1",
    "1000base-t",
    "1000base-bx10-d",
    "1000base-bx10-u",
    "1000base-cx",
    "1000base-cwdm",
    "1000base-dwdm",
    "1000base-ex",
    "1000base-sx",
    "1000base-lsx",
    "1000base-lx",
    "1000base-lx10",
    "1000base-t",
    "1000base-tx",
    "1000base-zx",
    "2.5gbase-t",
    "5gbase-t",
    "10gbase-t",
    "10gbase-br-d",
    "10gbase-br-u",
    "10gbase-cx4",
    "10gbase-er",
    "10gbase-lr",
    "10gbase-lrm",
    "10gbase-lx4",
    "10gbase-sr",
    "10gbase-t",
    "10gbase-zr",
    "25gbase-cr",
    "25gbase-er",
    "25gbase-lr",
    "25gbase-sr",
    "25gbase-t",
    "40gbase-cr4",
    "40gbase-er4",
    "40gbase-fr4",
    "40gbase-lr4",
    "40gbase-sr4",
    "50gbase-cr",
    "50gbase-er",
    "50gbase-fr",
    "50gbase-lr",
    "50gbase-sr",
    "100gbase-cr1",
    "100gbase-cr2",
    "100gbase-cr4",
    "100gbase-cr10",
    "100gbase-dr",
    "100gbase-er4",
    "100gbase-fr1",
    "100gbase-lr1",
    "100gbase-lr4",
    "100gbase-sr1",
    "100gbase-sr1.2",
    "100gbase-sr2",
    "100gbase-sr4",
    "100gbase-sr10",
    "100gbase-zr",
    "200gbase-cr2",
    "200gbase-cr4",
    "200gbase-sr2",
    "200gbase-sr4",
    "200gbase-dr4",
    "200gbase-er4",
    "200gbase-fr4",
    "200gbase-lr4",
    "200gbase-vr2",
    "400gbase-cr4",
    "400gbase-dr4",
    "400gbase-er8",
    "400gbase-fr4",
    "400gbase-fr8",
    "400gbase-lr4",
    "400gbase-lr8",
    "400gbase-sr4",
    "400gbase-sr4_2",
    "400gbase-sr8",
    "400gbase-sr16",
    "400gbase-vr4",
    "400gbase-zr",
    "800gbase-cr8",
    "800gbase-dr8",
    "800gbase-sr8",
    "800gbase-vr8",
    "100base-x-sfp",
    "1000base-x-gbic",
    "1000base-x-sfp",
contrib/openapi.json (new file, 257292 lines)
File diff suppressed because one or more lines are too long
@@ -25,7 +25,7 @@ Once finished, make note of the application (client) ID; this will be used when


!!! tip "Multitenant authentication"
    NetBox also supports multitenant authentication via Azure AD, however it requires a different backend and an additional configuration parameter. Please see the [`python-social-auth` documentation](https://python-social-auth.readthedocs.io/en/latest/backends/azuread.html#tenant-support) for details concerning multitenant authentication.
    NetBox also supports multitenant authentication via Azure AD; however, it requires a different backend and an additional configuration parameter. Please see the [`python-social-auth` documentation](https://python-social-auth.readthedocs.io/en/latest/backends/azuread.html#tenant-support) for details concerning multitenant authentication.

### 3. Create a secret
@@ -4,7 +4,7 @@

### Enabling Error Reporting

NetBox supports native integration with [Sentry](https://sentry.io/) for automatic error reporting. To enable this functionality, set `SENTRY_ENABLED` to True and define your unique [data source name (DSN)](https://docs.sentry.io/product/sentry-basics/concepts/dsn-explainer/) in `configuration.py`.
NetBox supports native integration with [Sentry](https://sentry.io/) for automatic error reporting. To enable this functionality, set `SENTRY_ENABLED` to `True` and define your unique [data source name (DSN)](https://docs.sentry.io/product/sentry-basics/concepts/dsn-explainer/) in `configuration.py`.

```python
SENTRY_ENABLED = True
```
@@ -106,7 +106,7 @@ This approach can span multiple levels of relations. For example, the following

!!! note
    While the above query is functional, it's not very efficient. There are ways to optimize such requests, however they are out of scope for this document. For more information, see the [Django queryset method reference](https://docs.djangoproject.com/en/stable/ref/models/querysets/) documentation.
    While the above query is functional, it's not very efficient. There are ways to optimize such requests; however, they are out of scope for this document. For more information, see the [Django queryset method reference](https://docs.djangoproject.com/en/stable/ref/models/querysets/) documentation.

Reverse relationships can be traversed as well. For example, the following will find all devices with an interface named "em0":
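The example that this sentence introduces is cut off by the hunk boundary. As a rough sketch of the kind of reverse-relation lookup being described (assuming the standard `dcim` models), it might look like:

```python
from dcim.models import Device

# Follow the reverse relation from Device to its interfaces and match on the interface name.
devices = Device.objects.filter(interfaces__name='em0')
```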
@@ -17,7 +17,7 @@ CUSTOM_VALIDATORS = {
        },
        "my_plugin.validators.Validator1"
    ],
    "dim.device": [
    "dcim.device": [
        "my_plugin.validators.Validator1"
    ]
}
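For context, the `my_plugin.validators.Validator1` entries above would point at a custom validator class. A minimal hypothetical sketch (the class, module, and rule are illustrative; the exact `validate()` signature may differ between NetBox releases):

```python
# my_plugin/validators.py (hypothetical module)
from extras.validators import CustomValidator


class Validator1(CustomValidator):
    """Reject devices whose name does not start with an approved prefix."""

    def validate(self, instance, request):
        if not instance.name.lower().startswith('dev-'):
            self.fail("Device names must begin with 'dev-'.", field='name')
```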
@@ -257,6 +257,46 @@ The specific configuration settings for each storage backend can be found in the
!!! note
    Any keys defined in the `STORAGES` configuration parameter replace those in the default configuration. It is only necessary to define keys within the `STORAGES` for the specific backend(s) you wish to configure.

### Environment Variables and Third-Party Libraries

NetBox uses an explicit Python configuration approach rather than automatic environment variable detection. While this provides clear configuration management and version control capabilities, it affects how some third-party libraries like `django-storages` function within NetBox's context.

Many Django libraries (including `django-storages`) expect to automatically detect environment variables like `AWS_STORAGE_BUCKET_NAME` or `AWS_S3_ACCESS_KEY_ID`. However, NetBox's configuration processing prevents this automatic detection from working as documented in some of these libraries.

When using third-party libraries that rely on environment variable detection, you may need to explicitly read environment variables in your NetBox `configuration.py`:

```python
import os

STORAGES = {
    'default': {
        'BACKEND': 'storages.backends.s3.S3Storage',
        'OPTIONS': {
            'bucket_name': os.environ.get('AWS_STORAGE_BUCKET_NAME'),
            'access_key': os.environ.get('AWS_S3_ACCESS_KEY_ID'),
            'secret_key': os.environ.get('AWS_S3_SECRET_ACCESS_KEY'),
            'endpoint_url': os.environ.get('AWS_S3_ENDPOINT_URL'),
            'location': 'media/',
        }
    },
    'staticfiles': {
        'BACKEND': 'storages.backends.s3.S3Storage',
        'OPTIONS': {
            'bucket_name': os.environ.get('AWS_STORAGE_BUCKET_NAME'),
            'access_key': os.environ.get('AWS_S3_ACCESS_KEY_ID'),
            'secret_key': os.environ.get('AWS_S3_SECRET_ACCESS_KEY'),
            'endpoint_url': os.environ.get('AWS_S3_ENDPOINT_URL'),
            'location': 'static/',
        }
    },
}
```

This approach works because the environment variables are resolved during NetBox's configuration processing, before the third-party library attempts its own environment variable detection.

!!! warning "Configuration Behavior"
    Simply setting environment variables like `AWS_STORAGE_BUCKET_NAME` without explicitly reading them in your configuration will not work. The variables must be read using `os.environ.get()` within your `configuration.py` file.

---

## TIME_ZONE
@@ -22,24 +22,9 @@ Stores registration made using `netbox.denormalized.register()`. For each model,

### `model_features`

A dictionary of particular features (e.g. custom fields) mapped to the NetBox models which support them, arranged by app. For example:
A dictionary of model features (e.g. custom fields, tags, etc.) mapped to the functions used to qualify a model as supporting each feature. Model features are registered using the `register_model_feature()` function in `netbox.utils`.

```python
{
    'custom_fields': {
        'circuits': ['provider', 'circuit'],
        'dcim': ['site', 'rack', 'devicetype', ...],
        ...
    },
    'event_rules': {
        'extras': ['configcontext', 'tag', ...],
        'dcim': ['site', 'rack', 'devicetype', ...],
    },
    ...
}
```

Supported model features are listed in the [features matrix](./models.md#features-matrix).
Core model features are listed in the [features matrix](./models.md#features-matrix).
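A hedged sketch of how this registry entry might be inspected, assuming the usual `netbox.registry` import path and the value type described above:

```python
from netbox.registry import registry

# Per the description above, each key is a feature name and each value is the
# callable used to test whether a given model supports that feature.
for feature_name, test in registry['model_features'].items():
    print(feature_name, test)
```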
### `models`
@@ -10,19 +10,26 @@ The Django [content types](https://docs.djangoproject.com/en/stable/ref/contrib/

Depending on its classification, each NetBox model may support various features which enhance its operation. Each feature is enabled by inheriting from its designated mixin class, and some features also make use of the [application registry](./application-registry.md#model_features).

| Feature | Feature Mixin | Registry Key | Description |
|---------|---------------|--------------|-------------|
| [Change logging](../features/change-logging.md) | `ChangeLoggingMixin` | - | Changes to these objects are automatically recorded in the change log |
| Cloning | `CloningMixin` | - | Provides the `clone()` method to prepare a copy |
| [Custom fields](../customization/custom-fields.md) | `CustomFieldsMixin` | `custom_fields` | These models support the addition of user-defined fields |
| [Custom links](../customization/custom-links.md) | `CustomLinksMixin` | `custom_links` | These models support the assignment of custom links |
| [Custom validation](../customization/custom-validation.md) | `CustomValidationMixin` | - | Supports the enforcement of custom validation rules |
| [Export templates](../customization/export-templates.md) | `ExportTemplatesMixin` | `export_templates` | Users can create custom export templates for these models |
| [Job results](../features/background-jobs.md) | `JobsMixin` | `jobs` | Background jobs can be scheduled for these models |
| [Journaling](../features/journaling.md) | `JournalingMixin` | `journaling` | These models support persistent historical commentary |
| [Synchronized data](../integrations/synchronized-data.md) | `SyncedDataMixin` | `synced_data` | Certain model data can be automatically synchronized from a remote data source |
| [Tagging](../models/extras/tag.md) | `TagsMixin` | `tags` | The models can be tagged with user-defined tags |
| [Event rules](../features/event-rules.md) | `EventRulesMixin` | `event_rules` | Event rules can send webhooks or run custom scripts automatically in response to events |

| Feature | Feature Mixin | Registry Key | Description |
|---------|---------------|--------------|-------------|
| [Bookmarks](../features/customization.md#bookmarks) | `BookmarksMixin` | `bookmarks` | These models can be bookmarked natively in the user interface |
| [Change logging](../features/change-logging.md) | `ChangeLoggingMixin` | `change_logging` | Changes to these objects are automatically recorded in the change log |
| Cloning | `CloningMixin` | `cloning` | Provides the `clone()` method to prepare a copy |
| [Contacts](../features/contacts.md) | `ContactsMixin` | `contacts` | Contacts can be associated with these models |
| [Custom fields](../customization/custom-fields.md) | `CustomFieldsMixin` | `custom_fields` | These models support the addition of user-defined fields |
| [Custom links](../customization/custom-links.md) | `CustomLinksMixin` | `custom_links` | These models support the assignment of custom links |
| [Custom validation](../customization/custom-validation.md) | `CustomValidationMixin` | - | Supports the enforcement of custom validation rules |
| [Event rules](../features/event-rules.md) | `EventRulesMixin` | `event_rules` | Event rules can send webhooks or run custom scripts automatically in response to events |
| [Export templates](../customization/export-templates.md) | `ExportTemplatesMixin` | `export_templates` | Users can create custom export templates for these models |
| [Image attachments](../models/extras/imageattachment.md) | `ImageAttachmentsMixin` | `image_attachments` | Image uploads can be attached to these models |
| [Jobs](../features/background-jobs.md) | `JobsMixin` | `jobs` | Background jobs can be scheduled for these models |
| [Journaling](../features/journaling.md) | `JournalingMixin` | `journaling` | These models support persistent historical commentary |
| [Notifications](../features/notifications.md) | `NotificationsMixin` | `notifications` | These models support user notifications |
| [Synchronized data](../integrations/synchronized-data.md) | `SyncedDataMixin` | `synced_data` | Certain model data can be automatically synchronized from a remote data source |
| [Tagging](../models/extras/tag.md) | `TagsMixin` | `tags` | The models can be tagged with user-defined tags |

!!! note
    The above listed features are supported natively by NetBox. Beginning with NetBox v4.4.0, plugins can register their own model features as well.

## Models Index
@@ -31,28 +31,14 @@ Close the [release milestone](https://github.com/netbox-community/netbox/milesto

Check that a link to the release notes for the new version is present in the navigation menu (defined in `mkdocs.yml`), and that a summary of all major new features has been added to `docs/index.md`.

### Update the Dependency Requirements Matrix

For every minor release, update the dependency requirements matrix in `docs/installation/upgrading.md` ("All versions") to reflect the supported versions of Python, PostgreSQL, and Redis:

1. Add a new row with the supported dependency versions.
2. Include a documentation link using the release tag format: `https://github.com/netbox-community/netbox/blob/v4.2.0/docs/installation/index.md`
3. Bold any version changes for clarity.

**Example Update:**

```markdown
| NetBox Version | Python min | Python max | PostgreSQL min | Redis min | Documentation |
|:--------------:|:----------:|:----------:|:--------------:|:---------:|:-------------------------------------------------------------------------------------------------:|
| 4.2 | 3.10 | 3.12 | **13** | 4.0 | [Link](https://github.com/netbox-community/netbox/blob/v4.2.0/docs/installation/index.md) |
```

### Update System Requirements

If a new Django release is adopted or other major dependencies (Python, PostgreSQL, Redis) change:

* Update the installation guide (`docs/installation/index.md`) with the new minimum versions.
* Update the upgrade guide (`docs/installation/upgrading.md`) for the current version accordingly.
* Update the upgrade guide (`docs/installation/upgrading.md`) for the current version.
* Update the minimum versions for each dependency.
* Add a new row to the release history table. Bold any version changes for clarity.
* Update the minimum PostgreSQL version in the programming error template (`netbox/templates/exceptions/programming_error.html`).
* Update the minimum and supported Python versions in the project metadata file (`pyproject.toml`)
@@ -137,16 +123,6 @@ $ node bundle.js
Done in 1.00s.

### Rebuild the Device Type Definition Schema

Run the following command to update the device type definition validation schema:

```nohighlight
./manage.py buildschema --write
```

This will automatically update the schema file at `contrib/generated_schema.json`.

### Update & Compile Translations

Updated language translations should be pulled from [Transifex](https://app.transifex.com/netbox-community/netbox/dashboard/) and re-compiled for each new release. First, retrieve any updated translation files using the Transifex CLI client:
@@ -174,6 +150,24 @@ Then, compile these portable (`.po`) files for use in the application:
!!! tip
    Put yourself in the shoes of the user when recording change notes. Focus on the effect that each change has for the end user, rather than the specific bits of code that were modified in a PR. Ensure that each message conveys meaning absent context of the initial feature request or bug report. Remember to include keywords or phrases (such as exception names) that can be easily searched.

### Rebuild the Device Type Definition Schema

Run the following command to update the device type definition validation schema:

```nohighlight
./manage.py buildschema --write
```

This will automatically update the schema file at `contrib/generated_schema.json`.

### Update the OpenAPI Schema

Update the static OpenAPI schema definition at `contrib/openapi.json` with the management command below. If the schema file is up-to-date, only the NetBox version will be changed.

```nohighlight
./manage.py spectacular --format openapi-json > ../contrib/openapi.json
```

### Submit a Pull Request

Commit the above changes and submit a pull request titled **"Release vX.Y.Z"** to merge the current release branch (e.g. `release-vX.Y.Z`) into `main`. Copy the documented release notes into the pull request's body.
@@ -17,7 +17,7 @@ Dedicate some time to take stock of your own sources of truth for your infrastru

* **Multiple conflicting sources** for a given domain. For example, there may be multiple versions of a spreadsheet circulating, each of which asserts a conflicting set of data.
* **Sources with no domain defined.** You may encounter that different teams within your organization use different tools for the same purpose, with no normal definition of when either should be used.
* **Inaccessible data formatting.** Some tools are better suited for programmatic usage than others. For example, spreadsheets are generally very easy to parse and export, however free-form notes on wiki or similar application are much more difficult to consume.
* **Inaccessible data formatting.** Some tools are better suited for programmatic usage than others. For example, spreadsheets are generally very easy to parse and export; however, free-form notes on wiki or similar application are much more difficult to consume.
* **There is no source of truth.** Sometimes you'll find that a source of truth simply doesn't exist for a domain. For example, when assigning IP addresses, operators may be just using any (presumed) available IP from a subnet without ever recording its usage.

See if you can identify each domain of infrastructure data for your organization, and the source of truth for each. Once you have these compiled, you'll need to determine what belongs in NetBox.
@@ -66,7 +66,7 @@ The top level is the project root, which can have any name that you like. Immedi
* `README.md` - A brief introduction to your plugin, how to install and configure it, where to find help, and any other pertinent information. It is recommended to write `README` files using a markup language such as Markdown to enable human-friendly display.
* The plugin source directory. This must be a valid Python package name, typically comprising only lowercase letters, numbers, and underscores.

The plugin source directory contains all the actual Python code and other resources used by your plugin. Its structure is left to the author's discretion, however it is recommended to follow best practices as outlined in the [Django documentation](https://docs.djangoproject.com/en/stable/intro/reusable-apps/). At a minimum, this directory **must** contain an `__init__.py` file containing an instance of NetBox's `PluginConfig` class, discussed below.
The plugin source directory contains all the actual Python code and other resources used by your plugin. Its structure is left to the author's discretion; however, it is recommended to follow best practices as outlined in the [Django documentation](https://docs.djangoproject.com/en/stable/intro/reusable-apps/). At a minimum, this directory **must** contain an `__init__.py` file containing an instance of NetBox's `PluginConfig` class, discussed below.

**Note:** The [Cookiecutter NetBox Plugin](https://github.com/netbox-community/cookiecutter-netbox-plugin) can be used to auto-generate all the needed directories and files for a new plugin.
@@ -186,7 +186,7 @@ Many of these are self-explanatory, but for more information, see the [pyproject

## Create a Virtual Environment

It is strongly recommended to create a Python [virtual environment](https://docs.python.org/3/tutorial/venv.html) for the development of your plugin, as opposed to using system-wide packages. This will afford you complete control over the installed versions of all dependencies and avoid conflict with system packages. This environment can live wherever you'd like, however it should be excluded from revision control. (A popular convention is to keep all virtual environments in the user's home directory, e.g. `~/.virtualenvs/`.)
It is strongly recommended to create a Python [virtual environment](https://docs.python.org/3/tutorial/venv.html) for the development of your plugin, as opposed to using system-wide packages. This will afford you complete control over the installed versions of all dependencies and avoid conflict with system packages. This environment can live wherever you'd like; however, it should be excluded from revision control. (A popular convention is to keep all virtual environments in the user's home directory, e.g. `~/.virtualenvs/`.)

```shell
python3 -m venv ~/.virtualenvs/my_plugin
```
@@ -24,20 +24,7 @@ Every model includes by default a numeric primary key. This value is generated a

## Enabling NetBox Features

Plugin models can leverage certain NetBox features by inheriting from NetBox's `NetBoxModel` class. This class extends the plugin model to enable features unique to NetBox, including:

* Bookmarks
* Change logging
* Cloning
* Custom fields
* Custom links
* Custom validation
* Export templates
* Journaling
* Tags
* Webhooks

This class performs two crucial functions:
Plugin models can leverage certain [model features](../../development/models.md#features-matrix) (such as tags, custom fields, event rules, etc.) by inheriting from NetBox's `NetBoxModel` class. This class performs two crucial functions:

1. Apply any fields, methods, and/or attributes necessary to the operation of these features
2. Register the model with NetBox as utilizing these features
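As a rough illustration of the pattern described above (the model and its fields are hypothetical, not part of this changeset):

```python
from django.db import models

from netbox.models import NetBoxModel


class WidgetProfile(NetBoxModel):
    """Inheriting from NetBoxModel opts this model into the core model features."""
    name = models.CharField(max_length=100, unique=True)
    comments = models.TextField(blank=True)

    class Meta:
        ordering = ('name',)

    def __str__(self):
        return self.name
```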
@@ -135,6 +122,27 @@ For more information about database migrations, see the [Django documentation](h

::: netbox.models.features.TagsMixin

## Custom Model Features

In addition to utilizing the model features provided natively by NetBox (listed above), plugins can register their own model features. This is done using the `register_model_feature()` function from `netbox.utils`. This function takes two arguments: a feature name, and a callable which accepts a model class. The callable must return a boolean value indicating whether the given model supports the named feature.

This function can be used as a decorator:

```python
@register_model_feature('foo')
def supports_foo(model):
    # Your logic here
```

Or it can be called directly:

```python
register_model_feature('foo', supports_foo)
```

!!! tip
    Consider performing feature registration inside your PluginConfig's `ready()` method.
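A sketch of what that registration might look like, following the tip above (the plugin, feature name, and predicate are hypothetical):

```python
from netbox.plugins import PluginConfig


class MyPluginConfig(PluginConfig):
    name = 'my_plugin'
    verbose_name = 'My Plugin'
    version = '0.1'

    def ready(self):
        super().ready()
        from netbox.utils import register_model_feature

        # Qualify models for the hypothetical 'foo' feature based on a class attribute.
        register_model_feature('foo', lambda model: getattr(model, 'supports_foo', False))


config = MyPluginConfig
```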
## Choice Sets

For model fields which support the selection of one or more values from a predefined list of choices, NetBox provides the `ChoiceSet` utility class. This can be used in place of a regular choices tuple to provide enhanced functionality, namely dynamic configuration and colorization. (See [Django's documentation](https://docs.djangoproject.com/en/stable/ref/models/fields/#choices) on the `choices` parameter for supported model fields.)
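A minimal sketch of a `ChoiceSet` subclass as described above (the choice values and colors are illustrative):

```python
from utilities.choices import ChoiceSet


class WidgetStatusChoices(ChoiceSet):
    # Defining a key allows the choices to be extended via the FIELD_CHOICES configuration parameter.
    key = 'WidgetProfile.status'

    STATUS_ACTIVE = 'active'
    STATUS_RESERVED = 'reserved'

    CHOICES = [
        (STATUS_ACTIVE, 'Active', 'green'),
        (STATUS_RESERVED, 'Reserved', 'cyan'),
    ]
```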
@@ -47,10 +47,19 @@ table.configure(request)

This will automatically apply any user-specific preferences for the table. (If using a generic view provided by NetBox, table configuration is handled automatically.)


### Bulk Edit and Delete Actions

Bulk edit and delete buttons are automatically added to the table, if there is an appropriate view registered to the `${modelname}_bulk_edit` or `${modelname}_bulk_delete` path name.

## Columns

The table column classes listed below are supported for use in plugins. These classes can be imported from `netbox.tables.columns`.

::: netbox.tables.ArrayColumn
    options:
      members: false

::: netbox.tables.BooleanColumn
    options:
      members: false
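For illustration, a plugin table using one of these column classes might look like the following sketch (the table, model, and field names are hypothetical):

```python
import django_tables2 as tables

from netbox.tables import NetBoxTable, columns
from my_plugin.models import WidgetProfile  # hypothetical plugin model


class WidgetProfileTable(NetBoxTable):
    name = tables.Column(linkify=True)
    # ArrayColumn renders the values of a PostgreSQL ArrayField as a list.
    sizes = columns.ArrayColumn()

    class Meta(NetBoxTable.Meta):
        model = WidgetProfile
        fields = ('pk', 'name', 'sizes')
        default_columns = ('name', 'sizes')
```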
@@ -89,7 +89,7 @@ The following condition will evaluate as true:

!!! note "Evaluating static choice fields"
    Pay close attention when evaluating static choice fields, such as the `status` field above. These fields typically render as a dictionary specifying both the field's raw value (`value`) and its human-friendly label (`label`). be sure to specify on which of these you want to match.
    Pay close attention when evaluating static choice fields, such as the `status` field above. These fields typically render as a dictionary specifying both the field's raw value (`value`) and its human-friendly label (`label`). Be sure to specify on which of these you want to match.

## Condition Sets
@@ -357,7 +357,7 @@ And the response:
...

All GraphQL requests are made at the `/graphql` URL (which also serves the GraphiQL UI). The API is currently read-only, however users who wish to disable it until needed can do so by setting the `GRAPHQL_ENABLED` configuration parameter to False. For more detail on NetBox's GraphQL implementation, see [the GraphQL API documentation](../integrations/graphql-api.md).
All GraphQL requests are made at the `/graphql` URL (which also serves the GraphiQL UI). The API is currently read-only; however, users who wish to disable it until needed can do so by setting the `GRAPHQL_ENABLED` configuration parameter to False. For more detail on NetBox's GraphQL implementation, see [the GraphQL API documentation](../integrations/graphql-api.md).
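For reference, disabling the GraphQL API as described above is a one-line setting in `configuration.py`:

```python
# configuration.py
GRAPHQL_ENABLED = False
```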

#### IP Ranges ([#834](https://github.com/netbox-community/netbox/issues/834))
@@ -1,5 +1,29 @@
# NetBox v4.3

## v4.3.7 (2025-08-26)

### Enhancements

* [#18147](https://github.com/netbox-community/netbox/issues/18147) - Add device & VM interface counts under related objects for VRFs
* [#19990](https://github.com/netbox-community/netbox/issues/19990) - Button to add a missing prerequisite now includes a return URL
* [#20122](https://github.com/netbox-community/netbox/issues/20122) - Improve color contrast of highlighted data under changelog diff view
* [#20131](https://github.com/netbox-community/netbox/issues/20131) - Add object selector for interface to the MAC address edit form

### Bug Fixes

* [#18916](https://github.com/netbox-community/netbox/issues/18916) - Fix dynamic dropdown selection styling for required fields when no selection is made
* [#19645](https://github.com/netbox-community/netbox/issues/19645) - Fix interface selection when adding a cable for a virtual chassis master
* [#19669](https://github.com/netbox-community/netbox/issues/19669) - Restore token authentication support for fetching media assets
* [#19970](https://github.com/netbox-community/netbox/issues/19970) - Device role child device counts should be cumulative
* [#20012](https://github.com/netbox-community/netbox/issues/20012) - Fix support for `empty` filter lookup on custom fields
* [#20043](https://github.com/netbox-community/netbox/issues/20043) - Fix page styling when rack elevations are embedded
* [#20098](https://github.com/netbox-community/netbox/issues/20098) - Fix `AttributeError` exception when assigning tags during bulk import
* [#20120](https://github.com/netbox-community/netbox/issues/20120) - Fix REST API serialization of jobs under `/api/core/background-tasks/`
* [#20157](https://github.com/netbox-community/netbox/issues/20157) - Fix `IntegrityError` exception when a duplicate notification is triggered
* [#20164](https://github.com/netbox-community/netbox/issues/20164) - Fix `ValueError` exception when attempting to add power outlets to devices in bulk

---

## v4.3.6 (2025-08-12)

### Enhancements
@@ -29,6 +53,8 @@
* [#20033](https://github.com/netbox-community/netbox/issues/20033) - Fix `TypeError` exception when bulk deleting bookmarks
* [#20056](https://github.com/netbox-community/netbox/issues/20056) - Fixed missing RF role options in device type schema validation

---

## v4.3.5 (2025-07-29)

### Enhancements
@@ -48,6 +74,8 @@
!!! note "Plugin Developer Advisory"
    The fix for bug [#18900](https://github.com/netbox-community/netbox/issues/18900) now raises explicit exceptions when API endpoints attempt to paginate unordered querysets. Plugin maintainers should review their API viewsets to ensure proper queryset ordering is applied before pagination, either by using `.order_by()` on querysets or by setting `ordering` in model Meta classes. Previously silent pagination issues in plugin code will now raise `QuerySetNotOrdered` exceptions and may require updates to maintain compatibility.
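A hedged sketch of the kind of adjustment the advisory calls for (the viewset and model are hypothetical; the rest of the viewset is unchanged):

```python
from netbox.api.viewsets import NetBoxModelViewSet

from my_plugin.models import Widget  # hypothetical plugin model


class WidgetViewSet(NetBoxModelViewSet):
    # Apply explicit ordering so pagination never operates on an unordered queryset.
    queryset = Widget.objects.all().order_by('name')
    # serializer_class, filterset_class, etc. remain as before.
```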
---

## v4.3.4 (2025-07-15)

### Enhancements
@@ -1,6 +1,48 @@
# NetBox v4.4

## v4.4.0 (FUTURE)
## v4.4.1 (2025-09-16)

### Enhancements

* [#15492](https://github.com/netbox-community/netbox/issues/15492) - Enable cloning of permissions
* [#16381](https://github.com/netbox-community/netbox/issues/16381) - Display script result timestamps in system timezone
* [#19262](https://github.com/netbox-community/netbox/issues/19262) - No longer restrict FHRP group assignment by assigned IP address
* [#19408](https://github.com/netbox-community/netbox/issues/19408) - Support export templates for circuit terminations and virtual circuit terminations
* [#19428](https://github.com/netbox-community/netbox/issues/19428) - Add an optional U height field to the devices table
* [#19547](https://github.com/netbox-community/netbox/issues/19547) - Add individual "sync" buttons in data sources table
* [#19865](https://github.com/netbox-community/netbox/issues/19865) - Reorganize cable type groupings
* [#20222](https://github.com/netbox-community/netbox/issues/20222) - Enable the `HttpOnly` flag for CSRF cookie
* [#20237](https://github.com/netbox-community/netbox/issues/20237) - Include VPN tunnel groups in global search results
* [#20241](https://github.com/netbox-community/netbox/issues/20241) - Record A & B terminations in cable changelog data
* [#20277](https://github.com/netbox-community/netbox/issues/20277) - Add support for attribute assignment to `deserialize_object()` utility
* [#20321](https://github.com/netbox-community/netbox/issues/20321) - Add physical media types for transceiver interfaces
* [#20347](https://github.com/netbox-community/netbox/issues/20347) - Add Wi-Fi Alliance aliases to 802.11 interface types

### Bug Fixes

* [#19729](https://github.com/netbox-community/netbox/issues/19729) - Restore `kind` filter for interfaces in GraphQL API
* [#19744](https://github.com/netbox-community/netbox/issues/19744) - Plugins list should be orderable by "active" column
* [#19851](https://github.com/netbox-community/netbox/issues/19851) - Fix `ValueError` complaining of missing `scope` when bulk importing wireless LANs
* [#19896](https://github.com/netbox-community/netbox/issues/19896) - Min/max values for decimal custom fields should accept decimal values
* [#20197](https://github.com/netbox-community/netbox/issues/20197) - Correct validation for virtual chassis parent interface
* [#20215](https://github.com/netbox-community/netbox/issues/20215) - All GraphQL filters for config contexts should be optional
* [#20217](https://github.com/netbox-community/netbox/issues/20217) - Remove "0 VLANs available" row at end of VLAN range table
* [#20221](https://github.com/netbox-community/netbox/issues/20221) - JSON fields should not coerce empty dictionaries to null
* [#20227](https://github.com/netbox-community/netbox/issues/20227) - Ensure consistent padding of Markdown content
* [#20234](https://github.com/netbox-community/netbox/issues/20234) - Fix "add" button link for prerequisite object warning in UI
* [#20236](https://github.com/netbox-community/netbox/issues/20236) - Strip invalid characters from uploaded image file names
* [#20238](https://github.com/netbox-community/netbox/issues/20238) - Fix support for outside IP assignment during bulk import of tunnel terminations
* [#20242](https://github.com/netbox-community/netbox/issues/20242) - Avoid `AttributeError` exception on background jobs with no request ID
* [#20252](https://github.com/netbox-community/netbox/issues/20252) - Remove generic AddObject from ObjectChildrenView to prevent duplicate "add" buttons
* [#20264](https://github.com/netbox-community/netbox/issues/20264) - Fix rendering of default icon in plugins list
* [#20272](https://github.com/netbox-community/netbox/issues/20272) - ConfigContexts assigned to ancestor locations should apply to device/VM
* [#20282](https://github.com/netbox-community/netbox/issues/20282) - Fix styling of prerequisite objects warning
* [#20298](https://github.com/netbox-community/netbox/issues/20298) - Display a placeholder when an image thumbnail fails to load
* [#20327](https://github.com/netbox-community/netbox/issues/20327) - Avoid calling `distinct()` on device/VM queryset when fetching config context data

---

## v4.4.0 (2025-09-02)

### New Features
@@ -45,6 +87,8 @@ A new ConfigContextProfile model has been introduced to support JSON schema vali
* [#18006](https://github.com/netbox-community/netbox/issues/18006) - A Javascript is now triggered when UI is toggled between light and dark mode
* [#19735](https://github.com/netbox-community/netbox/issues/19735) - Custom individual and bulk operations can now be registered under individual views using `ObjectAction`
* [#20003](https://github.com/netbox-community/netbox/issues/20003) - Enable registration of callbacks to provide supplementary webhook payload data
* [#20115](https://github.com/netbox-community/netbox/issues/20115) - Support the use of ArrayColumn for plugin tables
* [#20129](https://github.com/netbox-community/netbox/issues/20129) - Enable plugins to register custom model features

### Deprecations
@@ -30,6 +30,8 @@ plugins:
  python:
    paths: ["netbox"]
    options:
      docstring_options:
        warn_missing_types: false
      heading_level: 3
      members_order: source
      show_root_heading: true
@@ -6,7 +6,6 @@ from django.urls import reverse
from django.utils.translation import gettext_lazy as _

from circuits.choices import *
from circuits.constants import *
from dcim.models import CabledObjectModel
from netbox.models import ChangeLoggedModel, OrganizationalModel, PrimaryModel
from netbox.models.mixins import DistanceMixin
@@ -231,6 +230,7 @@ class CircuitGroupAssignment(CustomFieldsMixin, ExportTemplatesMixin, TagsMixin,
class CircuitTermination(
    CustomFieldsMixin,
    CustomLinksMixin,
    ExportTemplatesMixin,
    TagsMixin,
    ChangeLoggedModel,
    CabledObjectModel
@@ -8,7 +8,7 @@ from django.utils.translation import gettext_lazy as _

from circuits.choices import *
from netbox.models import ChangeLoggedModel, PrimaryModel
from netbox.models.features import CustomFieldsMixin, CustomLinksMixin, TagsMixin
from netbox.models.features import CustomFieldsMixin, CustomLinksMixin, ExportTemplatesMixin, TagsMixin
from .base import BaseCircuitType

__all__ = (
@@ -121,6 +121,7 @@ class VirtualCircuit(PrimaryModel):
class VirtualCircuitTermination(
    CustomFieldsMixin,
    CustomLinksMixin,
    ExportTemplatesMixin,
    TagsMixin,
    ChangeLoggedModel
):
@@ -18,8 +18,8 @@ class BackgroundTaskSerializer(serializers.Serializer):
    description = serializers.CharField()
    origin = serializers.CharField()
    func_name = serializers.CharField()
    args = serializers.ListField(child=serializers.CharField())
    kwargs = serializers.DictField()
    args = serializers.SerializerMethodField()
    kwargs = serializers.SerializerMethodField()
    result = serializers.CharField()
    timeout = serializers.IntegerField()
    result_ttl = serializers.IntegerField()
@@ -42,6 +42,16 @@
    is_scheduled = serializers.BooleanField()
    is_stopped = serializers.BooleanField()

    def get_args(self, obj) -> list:
        return [
            str(arg) for arg in obj.args
        ]

    def get_kwargs(self, obj) -> dict:
        return {
            key: str(value) for key, value in obj.kwargs.items()
        }

    def get_position(self, obj) -> int:
        return obj.get_position()
@@ -1,3 +1,4 @@
import inspect
from collections import defaultdict

from django.contrib.contenttypes.models import ContentType
@@ -64,6 +65,9 @@
        Retrieve or create and return the ObjectType for a model.
        """
        from netbox.models.features import get_model_features, model_is_public

        if not inspect.isclass(model):
            model = model.__class__
        opts = self._get_opts(model, for_concrete_model)

        try:
@@ -75,7 +79,7 @@
                app_label=opts.app_label,
                model=opts.model_name,
                public=model_is_public(model),
                features=get_model_features(model.__class__),
                features=get_model_features(model),
            )[0]

        return ot
@@ -93,6 +97,8 @@
        needed_models = defaultdict(set)
        needed_opts = defaultdict(list)
        for model in models:
            if not inspect.isclass(model):
                model = model.__class__
            opts = self._get_opts(model, for_concrete_models)
            needed_models[opts.app_label].add(opts.model_name)
            needed_opts[(opts.app_label, opts.model_name)].append(model)
@@ -117,7 +123,7 @@
                app_label=app_label,
                model=model_name,
                public=model_is_public(model),
                features=get_model_features(model.__class__),
                features=get_model_features(model),
            )

        return results
@@ -135,9 +141,9 @@
        """
        Return ObjectTypes only for models which support the given feature.

        Only ObjectTypes which list the specified feature will be included. Supported features are declared in
        netbox.models.features.FEATURES_MAP. For example, we can find all ObjectTypes for models which support event
        rules with:
        Only ObjectTypes which list the specified feature will be included. Supported features are declared in the
        application registry under `registry["model_features"]`. For example, we can find all ObjectTypes for models
        which support event rules with:

            ObjectType.objects.with_feature('event_rules')
        """
@@ -4,6 +4,7 @@ import django_tables2 as tables
from core.models import *
from netbox.tables import NetBoxTable, columns
from .columns import BackendTypeColumn
from .template_code import DATA_SOURCE_SYNC_BUTTON

__all__ = (
    'DataFileTable',
@@ -37,6 +38,9 @@ class DataSourceTable(NetBoxTable):
    tags = columns.TagColumn(
        url_name='core:datasource_list',
    )
    actions = columns.ActionsColumn(
        extra_buttons=DATA_SOURCE_SYNC_BUTTON,
    )

    class Meta(NetBoxTable.Meta):
        model = DataSource
@@ -1,10 +1,8 @@
import django_tables2 as tables
from django.urls import reverse
from django.utils.safestring import mark_safe
from django.utils.translation import gettext_lazy as _

from netbox.tables import BaseTable, columns
from .template_code import PLUGIN_IS_INSTALLED
from .template_code import PLUGIN_IS_INSTALLED, PLUGIN_NAME_TEMPLATE

__all__ = (
    'CatalogPluginTable',
@@ -12,12 +10,6 @@
)


PLUGIN_NAME_TEMPLATE = """
<img class="plugin-icon" src="{{ record.icon_url }}">
<a href="{% url 'core:plugin' record.config_name %}">{{ record.title_long }}</a>
"""


class PluginVersionTable(BaseTable):
    version = tables.Column(
        verbose_name=_('Version')
@@ -61,6 +53,7 @@ class CatalogPluginTable(BaseTable):
        verbose_name=_('Local')
    )
    is_installed = columns.TemplateColumn(
        accessor=tables.A('is_loaded'),
        verbose_name=_('Active'),
        template_code=PLUGIN_IS_INSTALLED
    )
@@ -93,10 +86,4 @@ class CatalogPluginTable(BaseTable):
        )
        # List installed plugins first, then certified plugins, then
        # everything else (with each tranche ordered alphabetically)
        order_by = ('-is_installed', '-is_certified', 'name')

    def render_title_long(self, value, record):
        if record.static:
            return value
        url = reverse('core:plugin', args=[record.config_name])
        return mark_safe(f"<a href='{url}'>{value}</a>")
        order_by = ('-is_installed', '-is_certified', 'title_long')
@@ -26,3 +26,29 @@ PLUGIN_IS_INSTALLED = """
    <span class="text-muted">—</span>
{% endif %}
"""

PLUGIN_NAME_TEMPLATE = """
{% load static %}
{% if record.icon_url %}
  <img class="plugin-icon" src="{{ record.icon_url }}">
{% else %}
  <img class="plugin-icon" src="{% static 'plugin-default.svg' %}">
{% endif %}
<a href="{% url 'core:plugin' record.config_name %}">{{ record.title_long }}</a>
"""

DATA_SOURCE_SYNC_BUTTON = """
{% load helpers %}
{% load i18n %}
{% if perms.core.sync_datasource %}
  {% if record.ready_for_sync %}
    <button class="btn btn-primary btn-sm" type="submit" formaction="{% url 'core:datasource_sync' pk=record.pk %}?return_url={{ request.get_full_path|urlencode }}" formmethod="post">
      <i class="mdi mdi-sync" aria-hidden="true"></i> {% trans "Sync" %}
    </button>
  {% else %}
    <button class="btn btn-primary btn-sm" disabled>
      <i class="mdi mdi-sync" aria-hidden="true"></i> {% trans "Sync" %}
    </button>
  {% endif %}
{% endif %}
"""
@@ -33,7 +33,13 @@ from utilities.forms import ConfirmationForm
from utilities.htmx import htmx_partial
from utilities.json import ConfigJSONEncoder
from utilities.query import count_related
from utilities.views import ContentTypePermissionRequiredMixin, GetRelatedModelsMixin, ViewTab, register_model_view
from utilities.views import (
    ContentTypePermissionRequiredMixin,
    GetRelatedModelsMixin,
    GetReturnURLMixin,
    ViewTab,
    register_model_view,
)
from . import filtersets, forms, tables
from .jobs import SyncDataSourceJob
from .models import *
@@ -66,7 +72,7 @@ class DataSourceView(GetRelatedModelsMixin, generic.ObjectView):


@register_model_view(DataSource, 'sync')
class DataSourceSyncView(BaseObjectView):
class DataSourceSyncView(GetReturnURLMixin, BaseObjectView):
    queryset = DataSource.objects.all()

    def get_required_permission(self):
@@ -85,7 +91,7 @@ class DataSourceSyncView(BaseObjectView):
            request,
            _("Queued job #{id} to sync {datasource}").format(id=job.pk, datasource=datasource)
        )
        return redirect(datasource.get_absolute_url())
        return redirect(self.get_return_url(request, datasource))


@register_model_view(DataSource, 'add', detail=False)
@@ -1,3 +1,5 @@
from rest_framework import serializers

from dcim.models import DeviceRole, InventoryItemRole
from extras.api.serializers_.configtemplates import ConfigTemplateSerializer
from netbox.api.fields import RelatedObjectCountField
@@ -13,10 +15,8 @@
class DeviceRoleSerializer(NestedGroupModelSerializer):
    parent = NestedDeviceRoleSerializer(required=False, allow_null=True, default=None)
    config_template = ConfigTemplateSerializer(nested=True, required=False, allow_null=True, default=None)

    # Related object counts
    device_count = RelatedObjectCountField('devices')
    virtualmachine_count = RelatedObjectCountField('virtual_machines')
    device_count = serializers.IntegerField(read_only=True, default=0)
    virtualmachine_count = serializers.IntegerField(read_only=True, default=0)

    class Meta:
        model = DeviceRole
@@ -352,7 +352,19 @@ class InventoryItemTemplateViewSet(MPTTLockedMixin, NetBoxModelViewSet):
#

class DeviceRoleViewSet(NetBoxModelViewSet):
    queryset = DeviceRole.objects.all()
    queryset = DeviceRole.objects.add_related_count(
        DeviceRole.objects.add_related_count(
            DeviceRole.objects.all(),
            VirtualMachine,
            'role',
            'virtualmachine_count',
            cumulative=True
        ),
        Device,
        'role',
        'device_count',
        cumulative=True
    )
    serializer_class = serializers.DeviceRoleSerializer
    filterset_class = filtersets.DeviceRoleFilterSet
@@ -889,22 +889,118 @@ class InterfaceTypeChoices(ChoiceSet):
    TYPE_BRIDGE = 'bridge'
    TYPE_LAG = 'lag'

    # Ethernet
    # FastEthernet
    TYPE_100ME_FX = '100base-fx'
    TYPE_100ME_LFX = '100base-lfx'
    TYPE_100ME_FIXED = '100base-tx'
    TYPE_100ME_FIXED = '100base-tx'  # TODO: Rename to _TX
    TYPE_100ME_T1 = '100base-t1'

    # GigabitEthernet
    TYPE_1GE_BX10_D = '1000base-bx10-d'
    TYPE_1GE_BX10_U = '1000base-bx10-u'
    TYPE_1GE_CWDM = '1000base-cwdm'
    TYPE_1GE_CX = '1000base-cx'
    TYPE_1GE_DWDM = '1000base-dwdm'
    TYPE_1GE_EX = '1000base-ex'
    TYPE_1GE_SX_FIXED = '1000base-sx'  # TODO: Drop _FIXED suffix
    TYPE_1GE_LSX = '1000base-lsx'
    TYPE_1GE_LX_FIXED = '1000base-lx'  # TODO: Drop _FIXED suffix
    TYPE_1GE_LX10 = '1000base-lx10'
    TYPE_1GE_FIXED = '1000base-t'  # TODO: Rename to _T
    TYPE_1GE_TX_FIXED = '1000base-tx'  # TODO: Drop _FIXED suffix
    TYPE_1GE_ZX = '1000base-zx'

    # 2.5/5 Gbps Ethernet
    TYPE_2GE_FIXED = '2.5gbase-t'  # TODO: Rename to _T
    TYPE_5GE_FIXED = '5gbase-t'  # TODO: Rename to _T

    # 10 Gbps Ethernet
    TYPE_10GE_BR_D = '10gbase-br-d'
    TYPE_10GE_BR_U = '10gbase-br-u'
    TYPE_10GE_CX4 = '10gbase-cx4'
    TYPE_10GE_ER = '10gbase-er'
    TYPE_10GE_LR = '10gbase-lr'
    TYPE_10GE_LRM = '10gbase-lrm'
    TYPE_10GE_LX4 = '10gbase-lx4'
    TYPE_10GE_SR = '10gbase-sr'
    TYPE_10GE_FIXED = '10gbase-t'
    TYPE_10GE_ZR = '10gbase-zr'

    # 25 Gbps Ethernet
    TYPE_25GE_CR = '25gbase-cr'
    TYPE_25GE_ER = '25gbase-er'
    TYPE_25GE_LR = '25gbase-lr'
    TYPE_25GE_SR = '25gbase-sr'
    TYPE_25GE_T = '25gbase-t'

    # 40 Gbps Ethernet
    TYPE_40GE_CR4 = '40gbase-cr4'
    TYPE_40GE_ER4 = '40gbase-er4'
    TYPE_40GE_FR4 = '40gbase-fr4'
    TYPE_40GE_LR4 = '40gbase-lr4'
    TYPE_40GE_SR4 = '40gbase-sr4'

    # 50 Gbps Ethernet
    TYPE_50GE_CR = '50gbase-cr'
    TYPE_50GE_ER = '50gbase-er'
    TYPE_50GE_FR = '50gbase-fr'
    TYPE_50GE_LR = '50gbase-lr'
    TYPE_50GE_SR = '50gbase-sr'

    # 100 Gbps Ethernet
    TYPE_100GE_CR1 = '100gbase-cr1'
    TYPE_100GE_CR2 = '100gbase-cr2'
    TYPE_100GE_CR4 = '100gbase-cr4'
    TYPE_100GE_CR10 = '100gbase-cr10'
    TYPE_100GE_CWDM4 = '100gbase-cwdm4'
    TYPE_100GE_DR = '100gbase-dr'
    TYPE_100GE_FR1 = '100gbase-fr1'
    TYPE_100GE_ER4 = '100gbase-er4'
    TYPE_100GE_LR1 = '100gbase-lr1'
    TYPE_100GE_LR4 = '100gbase-lr4'
    TYPE_100GE_SR1 = '100gbase-sr1'
    TYPE_100GE_SR1_2 = '100gbase-sr1.2'
    TYPE_100GE_SR2 = '100gbase-sr2'
    TYPE_100GE_SR4 = '100gbase-sr4'
    TYPE_100GE_SR10 = '100gbase-sr10'
    TYPE_100GE_ZR = '100gbase-zr'

    # 200 Gbps Ethernet
    TYPE_200GE_CR2 = '200gbase-cr2'
    TYPE_200GE_CR4 = '200gbase-cr4'
    TYPE_200GE_SR2 = '200gbase-sr2'
    TYPE_200GE_SR4 = '200gbase-sr4'
    TYPE_200GE_DR4 = '200gbase-dr4'
    TYPE_200GE_FR4 = '200gbase-fr4'
    TYPE_200GE_LR4 = '200gbase-lr4'
    TYPE_200GE_ER4 = '200gbase-er4'
    TYPE_200GE_VR2 = '200gbase-vr2'

    # 400 Gbps Ethernet
    TYPE_400GE_CR4 = '400gbase-cr4'
    TYPE_400GE_DR4 = '400gbase-dr4'
    TYPE_400GE_ER8 = '400gbase-er8'
    TYPE_400GE_FR4 = '400gbase-fr4'
    TYPE_400GE_FR8 = '400gbase-fr8'
    TYPE_400GE_LR4 = '400gbase-lr4'
    TYPE_400GE_LR8 = '400gbase-lr8'
    TYPE_400GE_SR4 = '400gbase-sr4'
    TYPE_400GE_SR4_2 = '400gbase-sr4_2'
    TYPE_400GE_SR8 = '400gbase-sr8'
    TYPE_400GE_SR16 = '400gbase-sr16'
    TYPE_400GE_VR4 = '400gbase-vr4'
    TYPE_400GE_ZR = '400gbase-zr'

    # 800 Gbps Ethernet
    TYPE_800GE_CR8 = '800gbase-cr8'
    TYPE_800GE_DR8 = '800gbase-dr8'
    TYPE_800GE_SR8 = '800gbase-sr8'
    TYPE_800GE_VR8 = '800gbase-vr8'

    # Ethernet (modular)
    TYPE_100ME_SFP = '100base-x-sfp'
    TYPE_1GE_FIXED = '1000base-t'
    TYPE_1GE_SX_FIXED = '1000base-sx'
    TYPE_1GE_LX_FIXED = '1000base-lx'
    TYPE_1GE_TX_FIXED = '1000base-tx'
    TYPE_1GE_GBIC = '1000base-x-gbic'
    TYPE_1GE_SFP = '1000base-x-sfp'
    TYPE_2GE_FIXED = '2.5gbase-t'
    TYPE_5GE_FIXED = '5gbase-t'
    TYPE_10GE_FIXED = '10gbase-t'
    TYPE_10GE_CX4 = '10gbase-cx4'
    TYPE_10GE_SFP_PLUS = '10gbase-x-sfpp'
    TYPE_10GE_XFP = '10gbase-x-xfp'
    TYPE_10GE_XENPAK = '10gbase-x-xenpak'
@@ -935,7 +1031,7 @@ class InterfaceTypeChoices(ChoiceSet):
|
||||
TYPE_800GE_QSFP_DD = '800gbase-x-qsfpdd'
|
||||
TYPE_800GE_OSFP = '800gbase-x-osfp'
|
||||
|
||||
# Ethernet Backplane
|
||||
# Backplane Ethernet
|
||||
TYPE_1GE_KX = '1000base-kx'
|
||||
TYPE_2GE_KX = '2.5gbase-kx'
|
||||
TYPE_5GE_KR = '5gbase-kr'
|
||||
@@ -1054,24 +1150,147 @@ class InterfaceTypeChoices(ChoiceSet):
|
||||
),
|
||||
),
|
||||
(
|
||||
_('Ethernet (fixed)'),
|
||||
_('FastEthernet (100 Mbps)'),
|
||||
(
|
||||
(TYPE_100ME_FX, '100BASE-FX (10/100ME FIBER)'),
|
||||
(TYPE_100ME_LFX, '100BASE-LFX (10/100ME FIBER)'),
|
||||
(TYPE_100ME_FX, '100BASE-FX (10/100ME)'),
|
||||
(TYPE_100ME_LFX, '100BASE-LFX (10/100ME)'),
|
||||
(TYPE_100ME_FIXED, '100BASE-TX (10/100ME)'),
|
||||
(TYPE_100ME_T1, '100BASE-T1 (10/100ME Single Pair)'),
|
||||
(TYPE_1GE_FIXED, '1000BASE-T (1GE)'),
|
||||
(TYPE_100ME_T1, '100BASE-T1 (10/100ME)'),
|
||||
),
|
||||
),
|
||||
(
|
||||
_('GigabitEthernet (1 Gbps)'),
|
||||
(
|
||||
(TYPE_1GE_BX10_D, '1000BASE-BX10-D (1GE BiDi Down)'),
|
||||
(TYPE_1GE_BX10_U, '1000BASE-BX10-U (1GE BiDi Up)'),
|
||||
(TYPE_1GE_CX, '1000BASE-CX (1GE DAC)'),
|
||||
(TYPE_1GE_CWDM, '1000BASE-CWDM (1GE)'),
|
||||
(TYPE_1GE_DWDM, '1000BASE-DWDM (1GE)'),
|
||||
(TYPE_1GE_EX, '1000BASE-EX (1GE)'),
|
||||
(TYPE_1GE_SX_FIXED, '1000BASE-SX (1GE)'),
|
||||
(TYPE_1GE_LSX, '1000BASE-LSX (1GE)'),
|
||||
(TYPE_1GE_LX_FIXED, '1000BASE-LX (1GE)'),
|
||||
(TYPE_1GE_LX10, '1000BASE-LX10/LH (1GE)'),
|
||||
(TYPE_1GE_FIXED, '1000BASE-T (1GE)'),
|
||||
(TYPE_1GE_TX_FIXED, '1000BASE-TX (1GE)'),
|
||||
(TYPE_1GE_ZX, '1000BASE-ZX (1GE)'),
|
||||
),
|
||||
),
|
||||
(
|
||||
_('2.5/5 Gbps Ethernet'),
|
||||
(
|
||||
(TYPE_2GE_FIXED, '2.5GBASE-T (2.5GE)'),
|
||||
(TYPE_5GE_FIXED, '5GBASE-T (5GE)'),
|
||||
),
|
||||
),
|
||||
(
|
||||
_('10 Gbps Ethernet'),
|
||||
(
|
||||
(TYPE_10GE_BR_D, '10GBASE-DR-D (10GE BiDi Down)'),
|
||||
(TYPE_10GE_BR_U, '10GBASE-DR-U (10GE BiDi Up)'),
|
||||
(TYPE_10GE_CX4, '10GBASE-CX4 (10GE DAC)'),
|
||||
(TYPE_10GE_ER, '10GBASE-ER (10GE)'),
|
||||
(TYPE_10GE_LR, '10GBASE-LR (10GE)'),
|
||||
(TYPE_10GE_LRM, '10GBASE-LRM (10GE)'),
|
||||
(TYPE_10GE_LX4, '10GBASE-LX4 (10GE)'),
|
||||
(TYPE_10GE_SR, '10GBASE-SR (10GE)'),
|
||||
(TYPE_10GE_FIXED, '10GBASE-T (10GE)'),
|
||||
(TYPE_10GE_CX4, '10GBASE-CX4 (10GE)'),
|
||||
(TYPE_10GE_ZR, '10GBASE-ZR (10GE)'),
|
||||
)
|
||||
),
|
||||
(
|
||||
_('Ethernet (modular)'),
|
||||
_('25 Gbps Ethernet'),
|
||||
(
|
||||
(TYPE_25GE_CR, '25GBASE-CR (25GE DAC)'),
|
||||
(TYPE_25GE_ER, '25GBASE-ER (25GE)'),
|
||||
(TYPE_25GE_LR, '25GBASE-LR (25GE)'),
|
||||
(TYPE_25GE_SR, '25GBASE-SR (25GE)'),
|
||||
(TYPE_25GE_T, '25GBASE-T (25GE)'),
|
||||
)
|
||||
),
|
||||
(
|
||||
_('40 Gbps Ethernet'),
|
||||
(
|
||||
(TYPE_40GE_CR4, '40GBASE-CR4 (40GE DAC)'),
|
||||
(TYPE_40GE_ER4, '40GBASE-ER4 (40GE)'),
|
||||
(TYPE_40GE_FR4, '40GBASE-FR4 (40GE)'),
|
||||
(TYPE_40GE_LR4, '40GBASE-LR4 (40GE)'),
|
||||
(TYPE_40GE_SR4, '40GBASE-SR4 (40GE)'),
|
||||
)
|
||||
),
|
||||
(
|
||||
_('50 Gbps Ethernet'),
|
||||
(
|
||||
(TYPE_50GE_CR, '50GBASE-CR (50GE DAC)'),
|
||||
(TYPE_50GE_ER, '50GBASE-ER (50GE)'),
|
||||
(TYPE_50GE_FR, '50GBASE-FR (50GE)'),
|
||||
(TYPE_50GE_LR, '50GBASE-LR (50GE)'),
|
||||
(TYPE_50GE_SR, '50GBASE-SR (50GE)'),
|
||||
)
|
||||
),
|
||||
(
|
||||
_('100 Gbps Ethernet'),
|
||||
(
|
||||
(TYPE_100GE_CR1, '100GBASE-CR1 (100GE DAC)'),
|
||||
(TYPE_100GE_CR2, '100GBASE-CR2 (100GE DAC)'),
|
||||
(TYPE_100GE_CR4, '100GBASE-CR4 (100GE DAC)'),
|
||||
(TYPE_100GE_CR10, '100GBASE-CR10 (100GE DAC)'),
|
||||
(TYPE_100GE_DR, '100GBASE-DR (100GE)'),
|
||||
(TYPE_100GE_ER4, '100GBASE-ER4 (100GE)'),
|
||||
(TYPE_100GE_FR1, '100GBASE-FR1 (100GE)'),
|
||||
(TYPE_100GE_LR1, '100GBASE-LR1 (100GE)'),
|
||||
(TYPE_100GE_LR4, '100GBASE-LR4 (100GE)'),
|
||||
(TYPE_100GE_SR1, '100GBASE-SR1 (100GE)'),
|
||||
(TYPE_100GE_SR1_2, '100GBASE-SR1.2 (100GE BiDi)'),
|
||||
(TYPE_100GE_SR2, '100GBASE-SR2 (100GE)'),
|
||||
(TYPE_100GE_SR4, '100GBASE-SR4 (100GE)'),
|
||||
(TYPE_100GE_SR10, '100GBASE-SR10 (100GE)'),
|
||||
(TYPE_100GE_ZR, '100GBASE-ZR (100GE)'),
|
||||
)
|
||||
),
|
||||
(
|
||||
_('200 Gbps Ethernet'),
|
||||
(
|
||||
(TYPE_200GE_CR2, '200GBASE-CR2 (200GE)'),
|
||||
(TYPE_200GE_CR4, '200GBASE-CR4 (200GE)'),
|
||||
(TYPE_200GE_SR2, '200GBASE-SR2 (200GE)'),
|
||||
(TYPE_200GE_SR4, '200GBASE-SR4 (200GE)'),
|
||||
(TYPE_200GE_DR4, '200GBASE-DR4 (200GE)'),
|
||||
(TYPE_200GE_ER4, '200GBASE-ER4 (200GE)'),
|
||||
(TYPE_200GE_FR4, '200GBASE-FR4 (200GE)'),
|
||||
(TYPE_200GE_LR4, '200GBASE-LR4 (200GE)'),
|
||||
(TYPE_200GE_VR2, '200GBASE-VR2 (200GE)'),
|
||||
)
|
||||
),
|
||||
(
|
||||
_('400 Gbps Ethernet'),
|
||||
(
|
||||
(TYPE_400GE_CR4, '400GBASE-CR4 (400GE)'),
|
||||
(TYPE_400GE_DR4, '400GBASE-DR4 (400GE)'),
|
||||
(TYPE_400GE_ER8, '400GBASE-ER8 (400GE)'),
|
||||
(TYPE_400GE_FR4, '400GBASE-FR4 (400GE)'),
|
||||
(TYPE_400GE_FR8, '400GBASE-FR8 (400GE)'),
|
||||
(TYPE_400GE_LR4, '400GBASE-LR4 (400GE)'),
|
||||
(TYPE_400GE_LR8, '400GBASE-LR8 (400GE)'),
|
||||
(TYPE_400GE_SR4, '400GBASE-SR4 (400GE)'),
|
||||
(TYPE_400GE_SR4_2, '400GBASE-SR4.2 (400GE BiDi)'),
|
||||
(TYPE_400GE_SR8, '400GBASE-SR8 (400GE)'),
|
||||
(TYPE_400GE_SR16, '400GBASE-SR16 (400GE)'),
|
||||
(TYPE_400GE_VR4, '400GBASE-VR4 (400GE)'),
|
||||
(TYPE_400GE_ZR, '400GBASE-ZR (400GE)'),
|
||||
)
|
||||
),
|
||||
(
|
||||
_('800 Gbps Ethernet'),
|
||||
(
|
||||
(TYPE_800GE_CR8, '800GBASE-CR8 (800GE)'),
|
||||
(TYPE_800GE_DR8, '800GBASE-DR8 (800GE)'),
|
||||
(TYPE_800GE_SR8, '800GBASE-SR8 (800GE)'),
|
||||
(TYPE_800GE_VR8, '800GBASE-VR8 (800GE)'),
|
||||
)
|
||||
),
|
||||
(
|
||||
_('Pluggable transceivers'),
|
||||
(
|
||||
(TYPE_100ME_SFP, 'SFP (100ME)'),
|
||||
(TYPE_1GE_GBIC, 'GBIC (1GE)'),
|
||||
@@ -1108,7 +1327,7 @@ class InterfaceTypeChoices(ChoiceSet):
|
||||
)
|
||||
),
|
||||
(
|
||||
_('Ethernet (backplane)'),
|
||||
_('Backplane Ethernet'),
|
||||
(
|
||||
(TYPE_1GE_KX, '1000BASE-KX (1GE)'),
|
||||
(TYPE_2GE_KX, '2.5GBASE-KX (2.5GE)'),
|
||||
@@ -1128,12 +1347,12 @@ class InterfaceTypeChoices(ChoiceSet):
|
||||
(
|
||||
(TYPE_80211A, 'IEEE 802.11a'),
|
||||
(TYPE_80211G, 'IEEE 802.11b/g'),
|
||||
(TYPE_80211N, 'IEEE 802.11n'),
|
||||
(TYPE_80211AC, 'IEEE 802.11ac'),
|
||||
(TYPE_80211AD, 'IEEE 802.11ad'),
|
||||
(TYPE_80211AX, 'IEEE 802.11ax'),
|
||||
(TYPE_80211AY, 'IEEE 802.11ay'),
|
||||
(TYPE_80211BE, 'IEEE 802.11be'),
|
||||
(TYPE_80211N, 'IEEE 802.11n (Wi-Fi 4)'),
|
||||
(TYPE_80211AC, 'IEEE 802.11ac (Wi-Fi 5)'),
|
||||
(TYPE_80211AD, 'IEEE 802.11ad (WiGig)'),
|
||||
(TYPE_80211AX, 'IEEE 802.11ax (Wi-Fi 6)'),
|
||||
(TYPE_80211AY, 'IEEE 802.11ay (WiGig)'),
|
||||
(TYPE_80211BE, 'IEEE 802.11be (Wi-Fi 7)'),
|
||||
(TYPE_802151, 'IEEE 802.15.1 (Bluetooth)'),
|
||||
(TYPE_802154, 'IEEE 802.15.4 (LR-WPAN)'),
|
||||
(TYPE_OTHER_WIRELESS, 'Other (Wireless)'),
|
||||
@@ -1497,8 +1716,9 @@ class PortTypeChoices(ChoiceSet):
|
||||
# Cables/links
|
||||
#
|
||||
|
||||
class CableTypeChoices(ChoiceSet):
|
||||
|
||||
class CableTypeChoices(ChoiceSet):
|
||||
# Copper - Twisted Pair (UTP/STP)
|
||||
TYPE_CAT3 = 'cat3'
|
||||
TYPE_CAT5 = 'cat5'
|
||||
TYPE_CAT5E = 'cat5e'
|
||||
@@ -1507,26 +1727,41 @@ class CableTypeChoices(ChoiceSet):
|
||||
TYPE_CAT7 = 'cat7'
|
||||
TYPE_CAT7A = 'cat7a'
|
||||
TYPE_CAT8 = 'cat8'
|
||||
TYPE_MRJ21_TRUNK = 'mrj21-trunk'
|
||||
|
||||
# Copper - Twinax (DAC)
|
||||
TYPE_DAC_ACTIVE = 'dac-active'
|
||||
TYPE_DAC_PASSIVE = 'dac-passive'
|
||||
TYPE_MRJ21_TRUNK = 'mrj21-trunk'
|
||||
|
||||
# Copper - Coaxial
|
||||
TYPE_COAXIAL = 'coaxial'
|
||||
|
||||
# Fiber Optic - Multimode
|
||||
TYPE_MMF = 'mmf'
|
||||
TYPE_MMF_OM1 = 'mmf-om1'
|
||||
TYPE_MMF_OM2 = 'mmf-om2'
|
||||
TYPE_MMF_OM3 = 'mmf-om3'
|
||||
TYPE_MMF_OM4 = 'mmf-om4'
|
||||
TYPE_MMF_OM5 = 'mmf-om5'
|
||||
|
||||
# Fiber Optic - Single-mode
|
||||
TYPE_SMF = 'smf'
|
||||
TYPE_SMF_OS1 = 'smf-os1'
|
||||
TYPE_SMF_OS2 = 'smf-os2'
|
||||
|
||||
# Fiber Optic - Other
|
||||
TYPE_AOC = 'aoc'
|
||||
|
||||
# Power
|
||||
TYPE_POWER = 'power'
|
||||
|
||||
# USB
|
||||
TYPE_USB = 'usb'
|
||||
|
||||
CHOICES = (
|
||||
(
|
||||
_('Copper'), (
|
||||
_('Copper - Twisted Pair (UTP/STP)'),
|
||||
(
|
||||
(TYPE_CAT3, 'CAT3'),
|
||||
(TYPE_CAT5, 'CAT5'),
|
||||
(TYPE_CAT5E, 'CAT5e'),
|
||||
@@ -1535,28 +1770,57 @@ class CableTypeChoices(ChoiceSet):
|
||||
(TYPE_CAT7, 'CAT7'),
|
||||
(TYPE_CAT7A, 'CAT7a'),
|
||||
(TYPE_CAT8, 'CAT8'),
|
||||
(TYPE_MRJ21_TRUNK, 'MRJ21 Trunk'),
|
||||
),
|
||||
),
|
||||
(
|
||||
_('Copper - Twinax (DAC)'),
|
||||
(
|
||||
(TYPE_DAC_ACTIVE, 'Direct Attach Copper (Active)'),
|
||||
(TYPE_DAC_PASSIVE, 'Direct Attach Copper (Passive)'),
|
||||
(TYPE_MRJ21_TRUNK, 'MRJ21 Trunk'),
|
||||
),
|
||||
),
|
||||
(
|
||||
_('Copper - Coaxial'),
|
||||
(
|
||||
(TYPE_COAXIAL, 'Coaxial'),
|
||||
),
|
||||
),
|
||||
(
|
||||
_('Fiber'), (
|
||||
_('Fiber - Multimode'),
|
||||
(
|
||||
(TYPE_MMF, 'Multimode Fiber'),
|
||||
(TYPE_MMF_OM1, 'Multimode Fiber (OM1)'),
|
||||
(TYPE_MMF_OM2, 'Multimode Fiber (OM2)'),
|
||||
(TYPE_MMF_OM3, 'Multimode Fiber (OM3)'),
|
||||
(TYPE_MMF_OM4, 'Multimode Fiber (OM4)'),
|
||||
(TYPE_MMF_OM5, 'Multimode Fiber (OM5)'),
|
||||
(TYPE_SMF, 'Singlemode Fiber'),
|
||||
(TYPE_SMF_OS1, 'Singlemode Fiber (OS1)'),
|
||||
(TYPE_SMF_OS2, 'Singlemode Fiber (OS2)'),
|
||||
(TYPE_AOC, 'Active Optical Cabling (AOC)'),
|
||||
),
|
||||
),
|
||||
(TYPE_USB, _('USB')),
|
||||
(TYPE_POWER, _('Power')),
|
||||
(
|
||||
_('Fiber - Single-mode'),
|
||||
(
|
||||
(TYPE_SMF, 'Single-mode Fiber'),
|
||||
(TYPE_SMF_OS1, 'Single-mode Fiber (OS1)'),
|
||||
(TYPE_SMF_OS2, 'Single-mode Fiber (OS2)'),
|
||||
),
|
||||
),
|
||||
(
|
||||
_('Fiber - Other'),
|
||||
((TYPE_AOC, 'Active Optical Cabling (AOC)'),),
|
||||
),
|
||||
(
|
||||
_('Power'),
|
||||
(
|
||||
(TYPE_POWER, 'Power'),
|
||||
),
|
||||
),
|
||||
(
|
||||
_('USB'),
|
||||
(
|
||||
(TYPE_USB, 'USB'),
|
||||
),
|
||||
),
|
||||
)
|
||||
|
||||
|
||||
|
||||
@@ -1918,6 +1918,16 @@ class InterfaceFilterSet(
    PathEndpointFilterSet,
    CommonInterfaceFilterSet
):
    virtual_chassis_member_or_master = MultiValueCharFilter(
        method='filter_virtual_chassis_member_or_master',
        field_name='name',
        label=_('Virtual Chassis Interfaces for Device when device is master')
    )
    virtual_chassis_member_or_master_id = MultiValueNumberFilter(
        method='filter_virtual_chassis_member_or_master',
        field_name='pk',
        label=_('Virtual Chassis Interfaces for Device when device is master (ID)')
    )
    virtual_chassis_member = MultiValueCharFilter(
        method='filter_virtual_chassis_member',
        field_name='name',

@@ -2028,11 +2038,14 @@ class InterfaceFilterSet(
        'cable_id', 'cable_end',
    )

    def filter_virtual_chassis_member(self, queryset, name, value):
    def filter_virtual_chassis_member_or_master(self, queryset, name, value):
        return self.filter_virtual_chassis_member(queryset, name, value, if_master=True)

    def filter_virtual_chassis_member(self, queryset, name, value, if_master=False):
        try:
            vc_interface_ids = []
            for device in Device.objects.filter(**{f'{name}__in': value}):
                vc_interface_ids.extend(device.vc_interfaces(if_master=False).values_list('id', flat=True))
                vc_interface_ids.extend(device.vc_interfaces(if_master=if_master).values_list('id', flat=True))
            return queryset.filter(pk__in=vc_interface_ids)
        except Device.DoesNotExist:
            return queryset.none()

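The new virtual_chassis_member_or_master filter returns a device's own interfaces and, when that device is the master of a virtual chassis, the interfaces of every other member as well. A minimal usage sketch (the device name is a placeholder; the REST endpoint accepts the same parameter as a query string, e.g. ?virtual_chassis_member_or_master=switch-1):

    from dcim.filtersets import InterfaceFilterSet
    from dcim.models import Interface

    # If "switch-1" is a VC master, interfaces from all chassis members match;
    # if it is an ordinary member, only its own interfaces are returned.
    filterset = InterfaceFilterSet(
        {'virtual_chassis_member_or_master': ['switch-1']},
        queryset=Interface.objects.all(),
    )
    matching_interfaces = filterset.qs
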
@@ -69,11 +69,14 @@ class PowerPortBulkCreateForm(


class PowerOutletBulkCreateForm(
    form_from_model(PowerOutlet, ['type', 'color', 'feed_leg', 'mark_connected']),
    form_from_model(PowerOutlet, ['type', 'status', 'color', 'feed_leg', 'mark_connected']),
    DeviceBulkAddComponentForm
):
    model = PowerOutlet
    field_order = ('name', 'label', 'type', 'feed_leg', 'description', 'tags')
    field_order = (
        'name', 'label', 'type', 'status', 'color', 'feed_leg', 'mark_connected',
        'description', 'tags',
    )


class InterfaceBulkCreateForm(

@@ -1181,7 +1181,7 @@ class InventoryItemImportForm(NetBoxModelImportForm):
        help_text=_('Component Type')
    )
    component_name = forms.CharField(
        label=_('Compnent name'),
        label=_('Component name'),
        required=False,
        help_text=_('Component Name')
    )

@@ -19,6 +19,11 @@ def get_cable_form(a_type, b_type):
|
||||
# Device component
|
||||
if hasattr(term_cls, 'device'):
|
||||
|
||||
# Dynamically change the param field for interfaces to use virtual_chassis filter
|
||||
query_param_device_field = 'device_id'
|
||||
if term_cls == Interface:
|
||||
query_param_device_field = 'virtual_chassis_member_or_master_id'
|
||||
|
||||
attrs[f'termination_{cable_end}_device'] = DynamicModelMultipleChoiceField(
|
||||
queryset=Device.objects.all(),
|
||||
label=_('Device'),
|
||||
@@ -36,7 +41,7 @@ def get_cable_form(a_type, b_type):
|
||||
'parent': 'device',
|
||||
},
|
||||
query_params={
|
||||
'device_id': f'$termination_{cable_end}_device',
|
||||
query_param_device_field: f'$termination_{cable_end}_device',
|
||||
'kind': 'physical', # Exclude virtual interfaces
|
||||
}
|
||||
)
|
||||
|
||||
@@ -1,6 +1,6 @@
|
||||
from django import forms
|
||||
from django.contrib.contenttypes.models import ContentType
|
||||
from django.core.exceptions import ObjectDoesNotExist
|
||||
from django.core.exceptions import ObjectDoesNotExist, ValidationError
|
||||
from django.utils.translation import gettext_lazy as _
|
||||
|
||||
from dcim.constants import LOCATION_SCOPE_TYPES
|
||||
@@ -48,8 +48,17 @@ class ScopedForm(forms.Form):
|
||||
def clean(self):
|
||||
super().clean()
|
||||
|
||||
scope = self.cleaned_data.get('scope')
|
||||
scope_type = self.cleaned_data.get('scope_type')
|
||||
if scope_type and not scope:
|
||||
raise ValidationError({
|
||||
'scope': _(
|
||||
"Please select a {scope_type}."
|
||||
).format(scope_type=scope_type.model_class()._meta.model_name)
|
||||
})
|
||||
|
||||
# Assign the selected scope (if any)
|
||||
self.instance.scope = self.cleaned_data.get('scope')
|
||||
self.instance.scope = scope
|
||||
|
||||
def _set_scoped_values(self):
|
||||
if scope_type_id := get_field_value(self, 'scope_type'):
|
||||
@@ -107,3 +116,15 @@ class ScopedImportForm(forms.Form):
|
||||
required=False,
|
||||
label=_('Scope type (app & model)')
|
||||
)
|
||||
|
||||
def clean(self):
|
||||
super().clean()
|
||||
|
||||
scope_id = self.cleaned_data.get('scope_id')
|
||||
scope_type = self.cleaned_data.get('scope_type')
|
||||
if scope_type and not scope_id:
|
||||
raise ValidationError({
|
||||
'scope_id': _(
|
||||
"Please select a {scope_type}."
|
||||
).format(scope_type=scope_type.model_class()._meta.model_name)
|
||||
})
|
||||
|
||||
@@ -1899,6 +1899,7 @@ class MACAddressForm(NetBoxModelForm):
|
||||
label=_('Interface'),
|
||||
queryset=Interface.objects.all(),
|
||||
required=False,
|
||||
selector=True,
|
||||
context={
|
||||
'parent': 'device',
|
||||
},
|
||||
@@ -1907,6 +1908,7 @@ class MACAddressForm(NetBoxModelForm):
|
||||
label=_('VM Interface'),
|
||||
queryset=VMInterface.objects.all(),
|
||||
required=False,
|
||||
selector=True,
|
||||
context={
|
||||
'parent': 'virtual_machine',
|
||||
},
|
||||
|
||||
@@ -12,6 +12,7 @@ __all__ = (
|
||||
'DeviceFaceEnum',
|
||||
'DeviceStatusEnum',
|
||||
'InterfaceDuplexEnum',
|
||||
'InterfaceKindEnum',
|
||||
'InterfaceModeEnum',
|
||||
'InterfacePoEModeEnum',
|
||||
'InterfacePoETypeEnum',
|
||||
@@ -48,6 +49,7 @@ DeviceAirflowEnum = strawberry.enum(DeviceAirflowChoices.as_enum(prefix='airflow
|
||||
DeviceFaceEnum = strawberry.enum(DeviceFaceChoices.as_enum(prefix='face'))
|
||||
DeviceStatusEnum = strawberry.enum(DeviceStatusChoices.as_enum(prefix='status'))
|
||||
InterfaceDuplexEnum = strawberry.enum(InterfaceDuplexChoices.as_enum(prefix='duplex'))
|
||||
InterfaceKindEnum = strawberry.enum(InterfaceKindChoices.as_enum(prefix='kind'))
|
||||
InterfaceModeEnum = strawberry.enum(InterfaceModeChoices.as_enum(prefix='mode'))
|
||||
InterfacePoEModeEnum = strawberry.enum(InterfacePoEModeChoices.as_enum(prefix='mode'))
|
||||
InterfacePoETypeEnum = strawberry.enum(InterfacePoETypeChoices.as_enum())
|
||||
|
||||
@@ -1,5 +1,6 @@
|
||||
from typing import Annotated, TYPE_CHECKING
|
||||
|
||||
from django.db.models import Q
|
||||
import strawberry
|
||||
import strawberry_django
|
||||
from strawberry.scalars import ID
|
||||
@@ -7,6 +8,8 @@ from strawberry_django import FilterLookup
|
||||
|
||||
from core.graphql.filter_mixins import ChangeLogFilterMixin
|
||||
from dcim import models
|
||||
from dcim.constants import *
|
||||
from dcim.graphql.enums import InterfaceKindEnum
|
||||
from extras.graphql.filter_mixins import ConfigContextFilterMixin
|
||||
from netbox.graphql.filter_mixins import (
|
||||
PrimaryModelFilterMixin,
|
||||
@@ -485,6 +488,27 @@ class InterfaceFilter(ModularComponentModelFilterMixin, InterfaceBaseFilterMixin
|
||||
strawberry_django.filter_field()
|
||||
)
|
||||
|
||||
@strawberry_django.filter_field
|
||||
def connected(self, queryset, value: bool, prefix: str):
|
||||
if value is True:
|
||||
return queryset, Q(**{f"{prefix}_path__is_active": True})
|
||||
else:
|
||||
return queryset, Q(**{f"{prefix}_path__isnull": True}) | Q(**{f"{prefix}_path__is_active": False})
|
||||
|
||||
@strawberry_django.filter_field
|
||||
def kind(
|
||||
self,
|
||||
queryset,
|
||||
value: Annotated['InterfaceKindEnum', strawberry.lazy('dcim.graphql.enums')],
|
||||
prefix: str
|
||||
):
|
||||
if value == InterfaceKindEnum.KIND_PHYSICAL:
|
||||
return queryset, ~Q(**{f"{prefix}type__in": NONCONNECTABLE_IFACE_TYPES})
|
||||
elif value == InterfaceKindEnum.KIND_VIRTUAL:
|
||||
return queryset, Q(**{f"{prefix}type__in": VIRTUAL_IFACE_TYPES})
|
||||
elif value == InterfaceKindEnum.KIND_WIRELESS:
|
||||
return queryset, Q(**{f"{prefix}type__in": WIRELESS_IFACE_TYPES})
|
||||
|
||||
|
||||
@strawberry_django.filter_type(models.InterfaceTemplate, lookups=True)
|
||||
class InterfaceTemplateFilter(ModularComponentTemplateFilterMixin):
|
||||
|
||||
@@ -18,6 +18,7 @@ from utilities.conversion import to_meters
|
||||
from utilities.exceptions import AbortRequest
|
||||
from utilities.fields import ColorField, GenericArrayForeignKey
|
||||
from utilities.querysets import RestrictedQuerySet
|
||||
from utilities.serialization import deserialize_object, serialize_object
|
||||
from wireless.models import WirelessLink
|
||||
from .device_components import FrontPort, RearPort, PathEndpoint
|
||||
|
||||
@@ -119,43 +120,61 @@ class Cable(PrimaryModel):
|
||||
pk = self.pk or self._pk
|
||||
return self.label or f'#{pk}'
|
||||
|
||||
@property
|
||||
def a_terminations(self):
|
||||
if hasattr(self, '_a_terminations'):
|
||||
return self._a_terminations
|
||||
def get_status_color(self):
|
||||
return LinkStatusChoices.colors.get(self.status)
|
||||
|
||||
def _get_x_terminations(self, side):
|
||||
"""
|
||||
Return the terminating objects for the given cable end (A or B).
|
||||
"""
|
||||
if side not in (CableEndChoices.SIDE_A, CableEndChoices.SIDE_B):
|
||||
raise ValueError(f"Unknown cable side: {side}")
|
||||
attr = f'_{side.lower()}_terminations'
|
||||
|
||||
if hasattr(self, attr):
|
||||
return getattr(self, attr)
|
||||
if not self.pk:
|
||||
return []
|
||||
|
||||
# Query self.terminations.all() to leverage cached results
|
||||
return [
|
||||
ct.termination for ct in self.terminations.all() if ct.cable_end == CableEndChoices.SIDE_A
|
||||
# Query self.terminations.all() to leverage cached results
|
||||
ct.termination for ct in self.terminations.all() if ct.cable_end == side
|
||||
]
|
||||
|
||||
def _set_x_terminations(self, side, value):
|
||||
"""
|
||||
Set the terminating objects for the given cable end (A or B).
|
||||
"""
|
||||
if side not in (CableEndChoices.SIDE_A, CableEndChoices.SIDE_B):
|
||||
raise ValueError(f"Unknown cable side: {side}")
|
||||
_attr = f'_{side.lower()}_terminations'
|
||||
|
||||
# If the provided value is a list of CableTermination IDs, resolve them
|
||||
# to their corresponding termination objects.
|
||||
if all(isinstance(item, int) for item in value):
|
||||
value = [
|
||||
ct.termination for ct in CableTermination.objects.filter(pk__in=value).prefetch_related('termination')
|
||||
]
|
||||
|
||||
if not self.pk or getattr(self, _attr, []) != list(value):
|
||||
self._terminations_modified = True
|
||||
|
||||
setattr(self, _attr, value)
|
||||
|
||||
@property
|
||||
def a_terminations(self):
|
||||
return self._get_x_terminations(CableEndChoices.SIDE_A)
|
||||
|
||||
@a_terminations.setter
|
||||
def a_terminations(self, value):
|
||||
if not self.pk or self.a_terminations != list(value):
|
||||
self._terminations_modified = True
|
||||
self._a_terminations = value
|
||||
self._set_x_terminations(CableEndChoices.SIDE_A, value)
|
||||
|
||||
@property
|
||||
def b_terminations(self):
|
||||
if hasattr(self, '_b_terminations'):
|
||||
return self._b_terminations
|
||||
|
||||
if not self.pk:
|
||||
return []
|
||||
|
||||
# Query self.terminations.all() to leverage cached results
|
||||
return [
|
||||
ct.termination for ct in self.terminations.all() if ct.cable_end == CableEndChoices.SIDE_B
|
||||
]
|
||||
return self._get_x_terminations(CableEndChoices.SIDE_B)
|
||||
|
||||
@b_terminations.setter
|
||||
def b_terminations(self, value):
|
||||
if not self.pk or self.b_terminations != list(value):
|
||||
self._terminations_modified = True
|
||||
self._b_terminations = value
|
||||
self._set_x_terminations(CableEndChoices.SIDE_B, value)
|
||||
|
||||
@property
|
||||
def color_name(self):
|
||||
@@ -208,7 +227,7 @@ class Cable(PrimaryModel):
|
||||
for termination in self.b_terminations:
|
||||
CableTermination(cable=self, cable_end='B', termination=termination).clean()
|
||||
|
||||
def save(self, *args, **kwargs):
|
||||
def save(self, *args, force_insert=False, force_update=False, using=None, update_fields=None):
|
||||
_created = self.pk is None
|
||||
|
||||
# Store the given length (if any) in meters for use in database ordering
|
||||
@@ -221,39 +240,87 @@ class Cable(PrimaryModel):
|
||||
if self.length is None:
|
||||
self.length_unit = None
|
||||
|
||||
super().save(*args, **kwargs)
|
||||
# If this is a new Cable, save it before attempting to create its CableTerminations
|
||||
if self._state.adding:
|
||||
super().save(*args, force_insert=True, using=using, update_fields=update_fields)
|
||||
# Update the private PK used in __str__()
|
||||
self._pk = self.pk
|
||||
|
||||
# Update the private pk used in __str__ in case this is a new object (i.e. just got its pk)
|
||||
self._pk = self.pk
|
||||
|
||||
# Retrieve existing A/B terminations for the Cable
|
||||
a_terminations = {ct.termination: ct for ct in self.terminations.filter(cable_end='A')}
|
||||
b_terminations = {ct.termination: ct for ct in self.terminations.filter(cable_end='B')}
|
||||
|
||||
# Delete stale CableTerminations
|
||||
if self._terminations_modified:
|
||||
for termination, ct in a_terminations.items():
|
||||
if termination.pk and termination not in self.a_terminations:
|
||||
ct.delete()
|
||||
for termination, ct in b_terminations.items():
|
||||
if termination.pk and termination not in self.b_terminations:
|
||||
ct.delete()
|
||||
self.update_terminations()
|
||||
|
||||
super().save(*args, force_update=True, using=using, update_fields=update_fields)
|
||||
|
||||
# Save new CableTerminations (if any)
|
||||
if self._terminations_modified:
|
||||
for termination in self.a_terminations:
|
||||
if not termination.pk or termination not in a_terminations:
|
||||
CableTermination(cable=self, cable_end='A', termination=termination).save()
|
||||
for termination in self.b_terminations:
|
||||
if not termination.pk or termination not in b_terminations:
|
||||
CableTermination(cable=self, cable_end='B', termination=termination).save()
|
||||
try:
|
||||
trace_paths.send(Cable, instance=self, created=_created)
|
||||
except UnsupportedCablePath as e:
|
||||
raise AbortRequest(e)
|
||||
|
||||
def get_status_color(self):
|
||||
return LinkStatusChoices.colors.get(self.status)
|
||||
def serialize_object(self, exclude=None):
|
||||
data = serialize_object(self, exclude=exclude or [])
|
||||
|
||||
# Add A & B terminations to the serialized data
|
||||
a_terminations, b_terminations = self.get_terminations()
|
||||
data['a_terminations'] = sorted([ct.pk for ct in a_terminations.values()])
|
||||
data['b_terminations'] = sorted([ct.pk for ct in b_terminations.values()])
|
||||
|
||||
return data
|
||||
|
||||
@classmethod
|
||||
def deserialize_object(cls, data, pk=None):
|
||||
a_terminations = data.pop('a_terminations', [])
|
||||
b_terminations = data.pop('b_terminations', [])
|
||||
|
||||
instance = deserialize_object(cls, data, pk=pk)
|
||||
|
||||
# Assign A & B termination objects to the Cable instance
|
||||
queryset = CableTermination.objects.prefetch_related('termination')
|
||||
instance.a_terminations = [
|
||||
ct.termination for ct in queryset.filter(pk__in=a_terminations)
|
||||
]
|
||||
instance.b_terminations = [
|
||||
ct.termination for ct in queryset.filter(pk__in=b_terminations)
|
||||
]
|
||||
|
||||
return instance
|
||||
|
||||
def get_terminations(self):
|
||||
"""
|
||||
Return two dictionaries mapping A & B side terminating objects to their corresponding CableTerminations
|
||||
for this Cable.
|
||||
"""
|
||||
a_terminations = {}
|
||||
b_terminations = {}
|
||||
|
||||
for ct in CableTermination.objects.filter(cable=self).prefetch_related('termination'):
|
||||
if ct.cable_end == CableEndChoices.SIDE_A:
|
||||
a_terminations[ct.termination] = ct
|
||||
else:
|
||||
b_terminations[ct.termination] = ct
|
||||
|
||||
return a_terminations, b_terminations
|
||||
|
||||
def update_terminations(self):
|
||||
"""
|
||||
Create/delete CableTerminations for this Cable to reflect its current state.
|
||||
"""
|
||||
a_terminations, b_terminations = self.get_terminations()
|
||||
|
||||
# Delete any stale CableTerminations
|
||||
for termination, ct in a_terminations.items():
|
||||
if termination.pk and termination not in self.a_terminations:
|
||||
ct.delete()
|
||||
for termination, ct in b_terminations.items():
|
||||
if termination.pk and termination not in self.b_terminations:
|
||||
ct.delete()
|
||||
|
||||
# Save any new CableTerminations
|
||||
for termination in self.a_terminations:
|
||||
if not termination.pk or termination not in a_terminations:
|
||||
CableTermination(cable=self, cable_end='A', termination=termination).save()
|
||||
for termination in self.b_terminations:
|
||||
if not termination.pk or termination not in b_terminations:
|
||||
CableTermination(cable=self, cable_end='B', termination=termination).save()
|
||||
|
||||
|
||||
class CableTermination(ChangeLoggedModel):
|
||||
|
||||
@@ -872,14 +872,14 @@ class Interface(ModularComponentModel, BaseInterface, CabledObjectModel, PathEnd
|
||||
"The selected parent interface ({interface}) belongs to a different device ({device})"
|
||||
).format(interface=self.parent, device=self.parent.device)
|
||||
})
|
||||
elif self.parent.device.virtual_chassis != self.parent.virtual_chassis:
|
||||
elif self.parent.device.virtual_chassis != self.device.virtual_chassis:
|
||||
raise ValidationError({
|
||||
'parent': _(
|
||||
"The selected parent interface ({interface}) belongs to {device}, which is not part of "
|
||||
"virtual chassis {virtual_chassis}."
|
||||
).format(
|
||||
interface=self.parent,
|
||||
device=self.parent_device,
|
||||
device=self.parent.device,
|
||||
virtual_chassis=self.device.virtual_chassis
|
||||
)
|
||||
})
|
||||
@@ -890,7 +890,7 @@ class Interface(ModularComponentModel, BaseInterface, CabledObjectModel, PathEnd
|
||||
if self.pk and self.bridge_id == self.pk:
|
||||
raise ValidationError({'bridge': _("An interface cannot be bridged to itself.")})
|
||||
|
||||
# A bridged interface belong to the same device or virtual chassis
|
||||
# A bridged interface belongs to the same device or virtual chassis
|
||||
if self.bridge and self.bridge.device != self.device:
|
||||
if self.device.virtual_chassis is None:
|
||||
raise ValidationError({
|
||||
|
||||
@@ -87,11 +87,9 @@ class CachedScopeMixin(models.Model):
|
||||
def clean(self):
|
||||
if self.scope_type and not (self.scope or self.scope_id):
|
||||
scope_type = self.scope_type.model_class()
|
||||
raise ValidationError({
|
||||
'scope': _(
|
||||
"Please select a {scope_type}."
|
||||
).format(scope_type=scope_type._meta.model_name)
|
||||
})
|
||||
raise ValidationError(
|
||||
_("Please select a {scope_type}.").format(scope_type=scope_type._meta.model_name)
|
||||
)
|
||||
super().clean()
|
||||
|
||||
def save(self, *args, **kwargs):
|
||||
|
||||
@@ -195,6 +195,11 @@ class DeviceTable(TenancyColumnsMixin, ContactsColumnMixin, NetBoxTable):
|
||||
linkify=True,
|
||||
verbose_name=_('Type')
|
||||
)
|
||||
u_height = columns.TemplateColumn(
|
||||
accessor=tables.A('device_type.u_height'),
|
||||
verbose_name=_('U Height'),
|
||||
template_code='{{ value|floatformat }}'
|
||||
)
|
||||
platform = tables.Column(
|
||||
linkify=True,
|
||||
verbose_name=_('Platform')
|
||||
|
||||
@@ -4444,6 +4444,9 @@ class InterfaceTestCase(TestCase, DeviceComponentFilterSetTests, ChangeLoggedFil
|
||||
)
|
||||
Device.objects.bulk_create(devices)
|
||||
|
||||
virtual_chassis.master = devices[0]
|
||||
virtual_chassis.save()
|
||||
|
||||
module_bays = (
|
||||
ModuleBay(device=devices[0], name='Module Bay 1'),
|
||||
ModuleBay(device=devices[1], name='Module Bay 2'),
|
||||
@@ -4830,6 +4833,19 @@ class InterfaceTestCase(TestCase, DeviceComponentFilterSetTests, ChangeLoggedFil
|
||||
params = {'device': [devices[0].name, devices[1].name]}
|
||||
self.assertEqual(self.filterset(params, self.queryset).qs.count(), 2)
|
||||
|
||||
def test_virtual_chassis_member_or_master(self):
|
||||
vc = VirtualChassis.objects.first()
|
||||
master = vc.master
|
||||
member = vc.members.exclude(pk=master.pk).first()
|
||||
params = {'virtual_chassis_member_or_master_id': [master.pk,]}
|
||||
self.assertEqual(self.filterset(params, self.queryset).qs.count(), 2)
|
||||
params = {'virtual_chassis_member_or_master_id': [member.pk,]}
|
||||
self.assertEqual(self.filterset(params, self.queryset).qs.count(), 1)
|
||||
params = {'virtual_chassis_member_or_master': [master.name,]}
|
||||
self.assertEqual(self.filterset(params, self.queryset).qs.count(), 2)
|
||||
params = {'virtual_chassis_member_or_master': [member.name,]}
|
||||
self.assertEqual(self.filterset(params, self.queryset).qs.count(), 1)
|
||||
|
||||
def test_virtual_chassis_member(self):
|
||||
# Device 1A & 3 have 1 management interface, Device 1B has 1 interfaces
|
||||
devices = Device.objects.filter(name__in=['Device 1A', 'Device 3'])
|
||||
|
||||
@@ -1078,14 +1078,14 @@ class ModuleTypeTestCase(ViewTestCases.PrimaryObjectViewTestCase):
|
||||
'dcim.add_modulebaytemplate',
|
||||
)
|
||||
|
||||
def verify_module_type_profile(scenario_name):
|
||||
# TODO: remove extra regression asserts once parent test supports testing all import fields
|
||||
fan_module_type = ModuleType.objects.get(part_number='generic-fan')
|
||||
fan_module_type_profile = ModuleTypeProfile.objects.get(name='Fan')
|
||||
assert fan_module_type.profile == fan_module_type_profile
|
||||
|
||||
# run base test
|
||||
super().test_bulk_import_objects_with_permission()
|
||||
|
||||
# TODO: remove extra regression asserts once parent test supports testing all import fields
|
||||
fan_module_type = ModuleType.objects.get(part_number='generic-fan')
|
||||
fan_module_type_profile = ModuleTypeProfile.objects.get(name='Fan')
|
||||
|
||||
assert fan_module_type.profile == fan_module_type_profile
|
||||
super().test_bulk_import_objects_with_permission(post_import_callback=verify_module_type_profile)
|
||||
|
||||
@override_settings(EXEMPT_VIEW_PERMISSIONS=['*'], EXEMPT_EXCLUDE_MODELS=[])
|
||||
def test_bulk_import_objects_with_constrained_permission(self):
|
||||
@@ -3290,8 +3290,10 @@ class CableTestCase(
|
||||
Device(name='Device 1', site=sites[0], device_type=devicetype, role=role),
|
||||
Device(name='Device 2', site=sites[0], device_type=devicetype, role=role),
|
||||
Device(name='Device 3', site=sites[0], device_type=devicetype, role=role),
|
||||
Device(name='Device 4', site=sites[0], device_type=devicetype, role=role),
|
||||
# Create 'Device 1' assigned to 'Site 2' (allowed since the site is different)
|
||||
Device(name='Device 1', site=sites[1], device_type=devicetype, role=role),
|
||||
Device(name='Device 5', site=sites[1], device_type=devicetype, role=role),
|
||||
)
|
||||
Device.objects.bulk_create(devices)
|
||||
|
||||
@@ -3300,22 +3302,36 @@ class CableTestCase(
|
||||
vc.save()
|
||||
|
||||
interfaces = (
|
||||
# Device 1, Site 1
|
||||
Interface(device=devices[0], name='Interface 1', type=InterfaceTypeChoices.TYPE_1GE_FIXED),
|
||||
Interface(device=devices[0], name='Interface 2', type=InterfaceTypeChoices.TYPE_1GE_FIXED),
|
||||
Interface(device=devices[0], name='Interface 3', type=InterfaceTypeChoices.TYPE_1GE_FIXED),
|
||||
# Device 2, Site 1
|
||||
Interface(device=devices[1], name='Interface 1', type=InterfaceTypeChoices.TYPE_1GE_FIXED),
|
||||
Interface(device=devices[1], name='Interface 2', type=InterfaceTypeChoices.TYPE_1GE_FIXED),
|
||||
Interface(device=devices[1], name='Interface 3', type=InterfaceTypeChoices.TYPE_1GE_FIXED),
|
||||
# Device 3, Site 1
|
||||
Interface(device=devices[2], name='Interface 1', type=InterfaceTypeChoices.TYPE_1GE_FIXED),
|
||||
Interface(device=devices[2], name='Interface 2', type=InterfaceTypeChoices.TYPE_1GE_FIXED),
|
||||
Interface(device=devices[2], name='Interface 3', type=InterfaceTypeChoices.TYPE_1GE_FIXED),
|
||||
# Device 3, Site 1
|
||||
Interface(device=devices[3], name='Interface 1', type=InterfaceTypeChoices.TYPE_1GE_FIXED),
|
||||
Interface(device=devices[3], name='Interface 2', type=InterfaceTypeChoices.TYPE_1GE_FIXED),
|
||||
Interface(device=devices[3], name='Interface 3', type=InterfaceTypeChoices.TYPE_1GE_FIXED),
|
||||
# Device 1, Site 2
|
||||
Interface(device=devices[4], name='Interface 1', type=InterfaceTypeChoices.TYPE_1GE_FIXED),
|
||||
Interface(device=devices[4], name='Interface 2', type=InterfaceTypeChoices.TYPE_1GE_FIXED),
|
||||
Interface(device=devices[4], name='Interface 3', type=InterfaceTypeChoices.TYPE_1GE_FIXED),
|
||||
|
||||
# Device 1, Site 2
|
||||
Interface(device=devices[5], name='Interface 1', type=InterfaceTypeChoices.TYPE_1GE_FIXED),
|
||||
Interface(device=devices[5], name='Interface 2', type=InterfaceTypeChoices.TYPE_1GE_FIXED),
|
||||
Interface(device=devices[5], name='Interface 3', type=InterfaceTypeChoices.TYPE_1GE_FIXED),
|
||||
|
||||
Interface(device=devices[1], name='Device 2 Interface', type=InterfaceTypeChoices.TYPE_1GE_FIXED),
|
||||
Interface(device=devices[2], name='Device 3 Interface', type=InterfaceTypeChoices.TYPE_1GE_FIXED),
|
||||
Interface(device=devices[3], name='Interface 4', type=InterfaceTypeChoices.TYPE_1GE_FIXED),
|
||||
Interface(device=devices[3], name='Interface 5', type=InterfaceTypeChoices.TYPE_1GE_FIXED),
|
||||
Interface(device=devices[4], name='Interface 4', type=InterfaceTypeChoices.TYPE_1GE_FIXED),
|
||||
Interface(device=devices[4], name='Interface 5', type=InterfaceTypeChoices.TYPE_1GE_FIXED),
|
||||
)
|
||||
Interface.objects.bulk_create(interfaces)
|
||||
|
||||
@@ -3342,16 +3358,29 @@ class CableTestCase(
|
||||
'tags': [t.pk for t in tags],
|
||||
}
|
||||
|
||||
# Ensure that CSV bulk import supports assigning terminations from parent devices that share
|
||||
# the same device name, provided those devices belong to different sites.
|
||||
cls.csv_data = (
|
||||
"side_a_site,side_a_device,side_a_type,side_a_name,side_b_site,side_b_device,side_b_type,side_b_name",
|
||||
"Site 1,Device 3,dcim.interface,Interface 1,Site 2,Device 1,dcim.interface,Interface 1",
|
||||
"Site 1,Device 3,dcim.interface,Interface 2,Site 2,Device 1,dcim.interface,Interface 2",
|
||||
"Site 1,Device 3,dcim.interface,Interface 3,Site 2,Device 1,dcim.interface,Interface 3",
|
||||
"Site 1,Device 1,dcim.interface,Device 2 Interface,Site 2,Device 1,dcim.interface,Interface 4",
|
||||
"Site 1,Device 1,dcim.interface,Device 3 Interface,Site 2,Device 1,dcim.interface,Interface 5",
|
||||
)
|
||||
cls.csv_data = {
|
||||
'default': (
|
||||
"side_a_device,side_a_type,side_a_name,side_b_device,side_b_type,side_b_name",
|
||||
"Device 4,dcim.interface,Interface 1,Device 5,dcim.interface,Interface 1",
|
||||
"Device 3,dcim.interface,Interface 2,Device 4,dcim.interface,Interface 2",
|
||||
"Device 3,dcim.interface,Interface 3,Device 4,dcim.interface,Interface 3",
|
||||
|
||||
# The following is no longer possible in this scenario, because there are multiple
|
||||
# devices named "Device 1" across multiple sites. See the "site-filtering" scenario
|
||||
# below for how to specify a site for non-unique device names.
|
||||
# "Device 1,dcim.interface,Device 3 Interface,Device 4,dcim.interface,Interface 5",
|
||||
),
|
||||
'site-filtering': (
|
||||
# Ensure that CSV bulk import supports assigning terminations from parent devices
|
||||
# that share the same device name, provided those devices belong to different sites.
|
||||
"side_a_site,side_a_device,side_a_type,side_a_name,side_b_site,side_b_device,side_b_type,side_b_name",
|
||||
"Site 1,Device 3,dcim.interface,Interface 1,Site 2,Device 1,dcim.interface,Interface 1",
|
||||
"Site 1,Device 3,dcim.interface,Interface 2,Site 2,Device 1,dcim.interface,Interface 2",
|
||||
"Site 1,Device 3,dcim.interface,Interface 3,Site 2,Device 1,dcim.interface,Interface 3",
|
||||
"Site 1,Device 1,dcim.interface,Device 2 Interface,Site 2,Device 1,dcim.interface,Interface 4",
|
||||
"Site 1,Device 1,dcim.interface,Device 3 Interface,Site 2,Device 1,dcim.interface,Interface 5",
|
||||
)
|
||||
}
|
||||
|
||||
cls.csv_update_data = (
|
||||
"id,label,color",
|
||||
|
||||
@@ -2040,9 +2040,18 @@ class InventoryItemTemplateBulkDeleteView(generic.BulkDeleteView):
|
||||
|
||||
@register_model_view(DeviceRole, 'list', path='', detail=False)
|
||||
class DeviceRoleListView(generic.ObjectListView):
|
||||
queryset = DeviceRole.objects.annotate(
|
||||
device_count=count_related(Device, 'role'),
|
||||
vm_count=count_related(VirtualMachine, 'role')
|
||||
queryset = DeviceRole.objects.add_related_count(
|
||||
DeviceRole.objects.add_related_count(
|
||||
DeviceRole.objects.all(),
|
||||
VirtualMachine,
|
||||
'role',
|
||||
'vm_count',
|
||||
cumulative=True
|
||||
),
|
||||
Device,
|
||||
'role',
|
||||
'device_count',
|
||||
cumulative=True
|
||||
)
|
||||
filterset = filtersets.DeviceRoleFilterSet
|
||||
filterset_form = forms.DeviceRoleFilterForm
|
||||
|
||||
@@ -76,11 +76,11 @@ class CustomFieldBulkEditForm(ChangelogMessageMixin, BulkEditForm):
|
||||
required=False,
|
||||
widget=BulkEditNullBooleanSelect()
|
||||
)
|
||||
validation_minimum = forms.IntegerField(
|
||||
validation_minimum = forms.DecimalField(
|
||||
label=_('Minimum value'),
|
||||
required=False,
|
||||
)
|
||||
validation_maximum = forms.IntegerField(
|
||||
validation_maximum = forms.DecimalField(
|
||||
label=_('Maximum value'),
|
||||
required=False,
|
||||
)
|
||||
|
||||
@@ -103,11 +103,11 @@ class CustomFieldFilterForm(SavedFiltersMixin, FilterForm):
|
||||
choices=BOOLEAN_WITH_BLANK_CHOICES
|
||||
)
|
||||
)
|
||||
validation_minimum = forms.IntegerField(
|
||||
validation_minimum = forms.DecimalField(
|
||||
label=_('Minimum value'),
|
||||
required=False
|
||||
)
|
||||
validation_maximum = forms.IntegerField(
|
||||
validation_maximum = forms.DecimalField(
|
||||
label=_('Maximum value'),
|
||||
required=False
|
||||
)
|
||||
|
||||
@@ -17,7 +17,7 @@ if TYPE_CHECKING:
|
||||
)
|
||||
from tenancy.graphql.filters import TenantFilter, TenantGroupFilter
|
||||
from netbox.graphql.enums import ColorEnum
|
||||
from netbox.graphql.filter_lookups import IntegerLookup, JSONFilter, StringArrayLookup, TreeNodeFilter
|
||||
from netbox.graphql.filter_lookups import FloatLookup, IntegerLookup, JSONFilter, StringArrayLookup, TreeNodeFilter
|
||||
from users.graphql.filters import GroupFilter, UserFilter
|
||||
from virtualization.graphql.filters import ClusterFilter, ClusterGroupFilter, ClusterTypeFilter
|
||||
from .enums import *
|
||||
@@ -43,12 +43,12 @@ __all__ = (
|
||||
|
||||
@strawberry_django.filter_type(models.ConfigContext, lookups=True)
|
||||
class ConfigContextFilter(BaseObjectTypeFilterMixin, SyncedDataFilterMixin, ChangeLogFilterMixin):
|
||||
name: FilterLookup[str] = strawberry_django.filter_field()
|
||||
name: FilterLookup[str] | None = strawberry_django.filter_field()
|
||||
weight: Annotated['IntegerLookup', strawberry.lazy('netbox.graphql.filter_lookups')] | None = (
|
||||
strawberry_django.filter_field()
|
||||
)
|
||||
description: FilterLookup[str] = strawberry_django.filter_field()
|
||||
is_active: FilterLookup[bool] = strawberry_django.filter_field()
|
||||
description: FilterLookup[str] | None = strawberry_django.filter_field()
|
||||
is_active: FilterLookup[bool] | None = strawberry_django.filter_field()
|
||||
regions: Annotated['RegionFilter', strawberry.lazy('dcim.graphql.filters')] | None = (
|
||||
strawberry_django.filter_field()
|
||||
)
|
||||
@@ -151,10 +151,10 @@ class CustomFieldFilter(BaseObjectTypeFilterMixin, ChangeLogFilterMixin):
|
||||
weight: Annotated['IntegerLookup', strawberry.lazy('netbox.graphql.filter_lookups')] | None = (
|
||||
strawberry_django.filter_field()
|
||||
)
|
||||
validation_minimum: Annotated['IntegerLookup', strawberry.lazy('netbox.graphql.filter_lookups')] | None = (
|
||||
validation_minimum: Annotated['FloatLookup', strawberry.lazy('netbox.graphql.filter_lookups')] | None = (
|
||||
strawberry_django.filter_field()
|
||||
)
|
||||
validation_maximum: Annotated['IntegerLookup', strawberry.lazy('netbox.graphql.filter_lookups')] | None = (
|
||||
validation_maximum: Annotated['FloatLookup', strawberry.lazy('netbox.graphql.filter_lookups')] | None = (
|
||||
strawberry_django.filter_field()
|
||||
)
|
||||
validation_regex: FilterLookup[str] | None = strawberry_django.filter_field()
|
||||
|
||||
@@ -106,7 +106,7 @@ class ScriptJob(JobRunner):

        # Add the current request as a property of the script
        script.request = request
        self.logger.debug(f"Request ID: {request.id}")
        self.logger.debug(f"Request ID: {request.id if request else None}")

        # Execute the script. If commit is True, wrap it with the event_tracking context manager to ensure we process
        # change logging, event rules, etc.

@@ -1,4 +1,5 @@
from django.db.models import CharField, Lookup
from django.db.models import CharField, JSONField, Lookup
from django.db.models.fields.json import KeyTextTransform

from .fields import CachedValueField

@@ -18,6 +19,30 @@ class Empty(Lookup):
        return f"CAST(LENGTH({sql}) AS BOOLEAN) IS TRUE", params


class JSONEmpty(Lookup):
    """
    Support "empty" lookups for JSONField keys.

    A key is considered empty if it is "", null, or does not exist.
    """
    lookup_name = "empty"

    def as_sql(self, compiler, connection):
        # self.lhs.lhs is the parent expression (could be a JSONField or another KeyTransform)
        # Rebuild the expression using KeyTextTransform to guarantee ->> (text)
        text_expr = KeyTextTransform(self.lhs.key_name, self.lhs.lhs)
        lhs_sql, lhs_params = compiler.compile(text_expr)

        value = self.rhs
        if value not in (True, False):
            raise ValueError("The 'empty' lookup only accepts True or False.")

        condition = '' if value else 'NOT '
        sql = f"(NULLIF({lhs_sql}, '') IS {condition}NULL)"

        return sql, lhs_params


class NetHost(Lookup):
    """
    Similar to ipam.lookups.NetHost, but casts the field to INET.

@@ -45,5 +70,6 @@ class NetContainsOrEquals(Lookup):


CharField.register_lookup(Empty)
JSONField.register_lookup(JSONEmpty)
CachedValueField.register_lookup(NetHost)
CachedValueField.register_lookup(NetContainsOrEquals)

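Registering JSONEmpty on JSONField is what makes the `empty` lookup used by the custom-field filters above work at the ORM level. A short sketch of the resulting queries (the model is NetBox's Site; the key name is illustrative):

    from dcim.models import Site

    # Sites whose "contact" custom field key is "", null, or missing entirely
    Site.objects.filter(custom_field_data__contact__empty=True)

    # Sites where that key holds a non-empty value
    Site.objects.filter(custom_field_data__contact__empty=False)
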
netbox/extras/migrations/0133_make_cf_minmax_decimal.py (new file, 21 lines)
@@ -0,0 +1,21 @@
from django.db import migrations, models


class Migration(migrations.Migration):

    dependencies = [
        ('extras', '0132_configcontextprofile'),
    ]

    operations = [
        migrations.AlterField(
            model_name='customfield',
            name='validation_maximum',
            field=models.DecimalField(blank=True, decimal_places=4, max_digits=16, null=True),
        ),
        migrations.AlterField(
            model_name='customfield',
            name='validation_minimum',
            field=models.DecimalField(blank=True, decimal_places=4, max_digits=16, null=True),
        ),
    ]

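This migration widens the custom-field validation bounds from big integers to 16-digit decimals with four decimal places, so fractional minimums and maximums become valid. A hedged sketch of a field definition that relies on it (the field name, bounds, and assigned object type are illustrative):

    from decimal import Decimal

    from core.models import ObjectType
    from dcim.models import Site
    from extras.choices import CustomFieldTypeChoices
    from extras.models import CustomField

    # A decimal custom field with fractional validation bounds
    cf = CustomField.objects.create(
        name='ambient_temp',
        type=CustomFieldTypeChoices.TYPE_DECIMAL,
        validation_minimum=Decimal('-40.5'),
        validation_maximum=Decimal('125.25'),
    )
    cf.object_types.set([ObjectType.objects.get_for_model(Site)])
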
@@ -174,13 +174,17 @@ class CustomField(CloningMixin, ExportTemplatesMixin, ChangeLoggedModel):
|
||||
verbose_name=_('display weight'),
|
||||
help_text=_('Fields with higher weights appear lower in a form.')
|
||||
)
|
||||
validation_minimum = models.BigIntegerField(
|
||||
validation_minimum = models.DecimalField(
|
||||
max_digits=16,
|
||||
decimal_places=4,
|
||||
blank=True,
|
||||
null=True,
|
||||
verbose_name=_('minimum value'),
|
||||
help_text=_('Minimum allowed value (for numeric fields)')
|
||||
)
|
||||
validation_maximum = models.BigIntegerField(
|
||||
validation_maximum = models.DecimalField(
|
||||
max_digits=16,
|
||||
decimal_places=4,
|
||||
blank=True,
|
||||
null=True,
|
||||
verbose_name=_('maximum value'),
|
||||
@@ -471,7 +475,7 @@ class CustomField(CloningMixin, ExportTemplatesMixin, ChangeLoggedModel):
|
||||
field = forms.DecimalField(
|
||||
required=required,
|
||||
initial=initial,
|
||||
max_digits=12,
|
||||
max_digits=16,
|
||||
decimal_places=4,
|
||||
min_value=self.validation_minimum,
|
||||
max_value=self.validation_maximum
|
||||
@@ -534,7 +538,7 @@ class CustomField(CloningMixin, ExportTemplatesMixin, ChangeLoggedModel):
|
||||
|
||||
# JSON
|
||||
elif self.type == CustomFieldTypeChoices.TYPE_JSON:
|
||||
field = JSONField(required=required, initial=json.dumps(initial) if initial else None)
|
||||
field = JSONField(required=required, initial=json.dumps(initial) if initial is not None else None)
|
||||
|
||||
# Object
|
||||
elif self.type == CustomFieldTypeChoices.TYPE_OBJECT:
|
||||
@@ -600,11 +604,19 @@ class CustomField(CloningMixin, ExportTemplatesMixin, ChangeLoggedModel):
|
||||
kwargs = {
|
||||
'field_name': f'custom_field_data__{self.name}'
|
||||
}
|
||||
# Native numeric filters will use `isnull` by default for empty lookups, but
|
||||
# JSON fields require `empty` (see bug #20012).
|
||||
if lookup_expr == 'isnull':
|
||||
lookup_expr = 'empty'
|
||||
if lookup_expr is not None:
|
||||
kwargs['lookup_expr'] = lookup_expr
|
||||
|
||||
# 'Empty' lookup is always a boolean
|
||||
if lookup_expr == 'empty':
|
||||
filter_class = django_filters.BooleanFilter
|
||||
|
||||
# Text/URL
|
||||
if self.type in (
|
||||
elif self.type in (
|
||||
CustomFieldTypeChoices.TYPE_TEXT,
|
||||
CustomFieldTypeChoices.TYPE_LONGTEXT,
|
||||
CustomFieldTypeChoices.TYPE_URL,
|
||||
|
||||
@@ -1,6 +1,6 @@
import json
import os
import urllib.parse
from pathlib import Path

from django.conf import settings
from django.contrib.contenttypes.fields import GenericForeignKey, GenericRelation

@@ -728,7 +728,9 @@ class ImageAttachment(ChangeLoggedModel):

    @property
    def filename(self):
        return os.path.basename(self.image.name).split('_', 2)[2]
        base_name = Path(self.image.name).name
        prefix = f"{self.object_type.model}_{self.object_id}_"
        return base_name.removeprefix(prefix)

    @property
    def html_tag(self):

@@ -173,14 +173,17 @@ class NotificationGroup(ChangeLoggedModel):
            User.objects.filter(groups__in=self.groups.all())
        ).order_by('username')

    def notify(self, **kwargs):
    def notify(self, object_type, object_id, **kwargs):
        """
        Bulk-create Notifications for all members of this group.
        """
        Notification.objects.bulk_create([
            Notification(user=member, **kwargs)
            for member in self.members
        ])
        for user in self.members:
            Notification.objects.update_or_create(
                object_type=object_type,
                object_id=object_id,
                user=user,
                defaults=kwargs
            )
    notify.alters_data = True

@@ -22,9 +22,10 @@ class ConfigContextQuerySet(RestrictedQuerySet):
|
||||
aggregate_data: If True, use the JSONBAgg aggregate function to return only the list of JSON data objects
|
||||
"""
|
||||
|
||||
# Device type and location assignment is relevant only for Devices
|
||||
# Device type and location assignment are relevant only for Devices
|
||||
device_type = getattr(obj, 'device_type', None)
|
||||
location = getattr(obj, 'location', None)
|
||||
locations = location.get_ancestors(include_self=True) if location else []
|
||||
|
||||
# Get assigned cluster, group, and type (if any)
|
||||
cluster = getattr(obj, 'cluster', None)
|
||||
@@ -49,7 +50,7 @@ class ConfigContextQuerySet(RestrictedQuerySet):
|
||||
Q(regions__in=regions) | Q(regions=None),
|
||||
Q(site_groups__in=sitegroups) | Q(site_groups=None),
|
||||
Q(sites=obj.site) | Q(sites=None),
|
||||
Q(locations=location) | Q(locations=None),
|
||||
Q(locations__in=locations) | Q(locations=None),
|
||||
Q(device_types=device_type) | Q(device_types=None),
|
||||
Q(roles__in=device_roles) | Q(roles=None),
|
||||
Q(platforms=obj.platform) | Q(platforms=None),
|
||||
@@ -92,7 +93,7 @@ class ConfigContextModelQuerySet(RestrictedQuerySet):
|
||||
_data=EmptyGroupByJSONBAgg('data', ordering=['weight', 'name'])
|
||||
).values("_data").order_by()
|
||||
)
|
||||
).distinct()
|
||||
)
|
||||
|
||||
def _get_config_context_filters(self):
|
||||
# Construct the set of Q objects for the specific object types
|
||||
@@ -116,7 +117,7 @@ class ConfigContextModelQuerySet(RestrictedQuerySet):
|
||||
).values_list(
|
||||
'tag_id',
|
||||
flat=True
|
||||
)
|
||||
).distinct()
|
||||
)
|
||||
) | Q(tags=None),
|
||||
is_active=True,
|
||||
@@ -124,7 +125,15 @@ class ConfigContextModelQuerySet(RestrictedQuerySet):
|
||||
|
||||
# Apply Location & DeviceType filters only for VirtualMachines
|
||||
if self.model._meta.model_name == 'device':
|
||||
base_query.add((Q(locations=OuterRef('location')) | Q(locations=None)), Q.AND)
|
||||
base_query.add(
|
||||
(Q(
|
||||
locations__tree_id=OuterRef('location__tree_id'),
|
||||
locations__level__lte=OuterRef('location__level'),
|
||||
locations__lft__lte=OuterRef('location__lft'),
|
||||
locations__rght__gte=OuterRef('location__rght'),
|
||||
) | Q(locations=None)),
|
||||
Q.AND
|
||||
)
|
||||
base_query.add((Q(device_types=OuterRef('device_type')) | Q(device_types=None)), Q.AND)
|
||||
elif self.model._meta.model_name == 'virtualmachine':
|
||||
base_query.add(Q(locations=None), Q.AND)
|
||||
|
||||
@@ -725,8 +725,9 @@ class ScriptResultsTable(BaseTable):
    index = tables.Column(
        verbose_name=_('Line')
    )
    time = tables.Column(
        verbose_name=_('Time')
    time = columns.DateTimeColumn(
        verbose_name=_('Time'),
        timespec='seconds'
    )
    status = tables.TemplateColumn(
        template_code="""{% load log_levels %}{% log_level record.status %}""",

@@ -1,7 +1,9 @@
|
||||
import datetime
|
||||
import json
|
||||
from decimal import Decimal
|
||||
|
||||
from django.core.exceptions import ValidationError
|
||||
from django.test import tag
|
||||
from django.urls import reverse
|
||||
from rest_framework import status
|
||||
|
||||
@@ -269,6 +271,60 @@ class CustomFieldTest(TestCase):
|
||||
instance.refresh_from_db()
|
||||
self.assertIsNone(instance.custom_field_data.get(cf.name))
|
||||
|
||||
@tag('regression')
|
||||
def test_json_field_falsy_defaults(self):
|
||||
"""Test that falsy JSON default values are properly handled"""
|
||||
falsy_test_cases = [
|
||||
({}, 'empty_dict'),
|
||||
([], 'empty_array'),
|
||||
(0, 'zero'),
|
||||
(False, 'false_bool'),
|
||||
("", 'empty_string'),
|
||||
]
|
||||
|
||||
for default, suffix in falsy_test_cases:
|
||||
with self.subTest(default=default, suffix=suffix):
|
||||
cf = CustomField.objects.create(
|
||||
name=f'json_falsy_{suffix}',
|
||||
type=CustomFieldTypeChoices.TYPE_JSON,
|
||||
default=default,
|
||||
required=False
|
||||
)
|
||||
cf.object_types.set([self.object_type])
|
||||
|
||||
instance = Site.objects.create(name=f'Test Site {suffix}', slug=f'test-site-{suffix}')
|
||||
|
||||
self.assertIsNotNone(instance.custom_field_data)
|
||||
self.assertIn(cf.name, instance.custom_field_data)
|
||||
|
||||
instance.refresh_from_db()
|
||||
stored = instance.custom_field_data[cf.name]
|
||||
self.assertEqual(stored, default)
|
||||
|
||||
@tag('regression')
|
||||
def test_json_field_falsy_to_form_field(self):
|
||||
"""Test form field generation preserves falsy defaults"""
|
||||
falsy_test_cases = (
|
||||
({}, json.dumps({}), 'empty_dict'),
|
||||
([], json.dumps([]), 'empty_array'),
|
||||
(0, json.dumps(0), 'zero'),
|
||||
(False, json.dumps(False), 'false_bool'),
|
||||
("", '""', 'empty_string'),
|
||||
)
|
||||
|
||||
for default, expected, suffix in falsy_test_cases:
|
||||
with self.subTest(default=default, expected=expected, suffix=suffix):
|
||||
cf = CustomField.objects.create(
|
||||
name=f'json_falsy_{suffix}',
|
||||
type=CustomFieldTypeChoices.TYPE_JSON,
|
||||
default=default,
|
||||
required=False
|
||||
)
|
||||
cf.object_types.set([self.object_type])
|
||||
|
||||
form_field = cf.to_form_field(set_initial=True)
|
||||
self.assertEqual(form_field.initial, expected)
|
||||
|
||||
def test_select_field(self):
|
||||
CHOICES = (
|
||||
('a', 'Option A'),
|
||||
@@ -1615,6 +1671,7 @@ class CustomFieldModelFilterTest(TestCase):
|
||||
'cf11': manufacturers[2].pk,
|
||||
'cf12': [manufacturers[2].pk, manufacturers[3].pk],
|
||||
}),
|
||||
Site(name='Site 4', slug='site-4'),
|
||||
])
|
||||
|
||||
def test_filter_integer(self):
|
||||
@@ -1624,6 +1681,7 @@ class CustomFieldModelFilterTest(TestCase):
|
||||
self.assertEqual(self.filterset({'cf_cf1__gte': [200]}, self.queryset).qs.count(), 2)
|
||||
self.assertEqual(self.filterset({'cf_cf1__lt': [200]}, self.queryset).qs.count(), 1)
|
||||
self.assertEqual(self.filterset({'cf_cf1__lte': [200]}, self.queryset).qs.count(), 2)
|
||||
self.assertEqual(self.filterset({'cf_cf1__empty': True}, self.queryset).qs.count(), 1)
|
||||
|
||||
def test_filter_decimal(self):
|
||||
self.assertEqual(self.filterset({'cf_cf2': [100.1, 200.2]}, self.queryset).qs.count(), 2)
|
||||
@@ -1632,6 +1690,7 @@ class CustomFieldModelFilterTest(TestCase):
|
||||
self.assertEqual(self.filterset({'cf_cf2__gte': [200.2]}, self.queryset).qs.count(), 2)
|
||||
self.assertEqual(self.filterset({'cf_cf2__lt': [200.2]}, self.queryset).qs.count(), 1)
|
||||
self.assertEqual(self.filterset({'cf_cf2__lte': [200.2]}, self.queryset).qs.count(), 2)
|
||||
self.assertEqual(self.filterset({'cf_cf2__empty': True}, self.queryset).qs.count(), 1)
|
||||
|
||||
def test_filter_boolean(self):
|
||||
self.assertEqual(self.filterset({'cf_cf3': True}, self.queryset).qs.count(), 2)
|
||||
@@ -1648,6 +1707,7 @@ class CustomFieldModelFilterTest(TestCase):
|
||||
self.assertEqual(self.filterset({'cf_cf4__niew': ['bar']}, self.queryset).qs.count(), 1)
|
||||
self.assertEqual(self.filterset({'cf_cf4__ie': ['FOO']}, self.queryset).qs.count(), 1)
|
||||
self.assertEqual(self.filterset({'cf_cf4__nie': ['FOO']}, self.queryset).qs.count(), 2)
|
||||
self.assertEqual(self.filterset({'cf_cf4__empty': True}, self.queryset).qs.count(), 1)
|
||||
|
||||
def test_filter_text_loose(self):
|
||||
self.assertEqual(self.filterset({'cf_cf5': ['foo']}, self.queryset).qs.count(), 2)
|
||||
@@ -1659,6 +1719,7 @@ class CustomFieldModelFilterTest(TestCase):
|
||||
self.assertEqual(self.filterset({'cf_cf6__gte': ['2016-06-27']}, self.queryset).qs.count(), 2)
|
||||
self.assertEqual(self.filterset({'cf_cf6__lt': ['2016-06-27']}, self.queryset).qs.count(), 1)
|
||||
self.assertEqual(self.filterset({'cf_cf6__lte': ['2016-06-27']}, self.queryset).qs.count(), 2)
|
||||
self.assertEqual(self.filterset({'cf_cf6__empty': True}, self.queryset).qs.count(), 1)
|
||||
|
||||
def test_filter_url_strict(self):
|
||||
self.assertEqual(
|
||||
@@ -1674,17 +1735,20 @@ class CustomFieldModelFilterTest(TestCase):
|
||||
self.assertEqual(self.filterset({'cf_cf7__niew': ['.com']}, self.queryset).qs.count(), 0)
|
||||
self.assertEqual(self.filterset({'cf_cf7__ie': ['HTTP://A.EXAMPLE.COM']}, self.queryset).qs.count(), 1)
|
||||
self.assertEqual(self.filterset({'cf_cf7__nie': ['HTTP://A.EXAMPLE.COM']}, self.queryset).qs.count(), 2)
|
||||
self.assertEqual(self.filterset({'cf_cf7__empty': True}, self.queryset).qs.count(), 1)
|
||||
|
||||
def test_filter_url_loose(self):
|
||||
self.assertEqual(self.filterset({'cf_cf8': ['example.com']}, self.queryset).qs.count(), 3)
|
||||
|
||||
def test_filter_select(self):
|
||||
self.assertEqual(self.filterset({'cf_cf9': ['A', 'B']}, self.queryset).qs.count(), 2)
|
||||
self.assertEqual(self.filterset({'cf_cf9__empty': True}, self.queryset).qs.count(), 1)
|
||||
|
||||
def test_filter_multiselect(self):
|
||||
self.assertEqual(self.filterset({'cf_cf10': ['A']}, self.queryset).qs.count(), 1)
|
||||
self.assertEqual(self.filterset({'cf_cf10': ['A', 'C']}, self.queryset).qs.count(), 2)
|
||||
self.assertEqual(self.filterset({'cf_cf10': ['null']}, self.queryset).qs.count(), 1)
|
||||
self.assertEqual(self.filterset({'cf_cf10': ['null']}, self.queryset).qs.count(), 1) # Contains a literal null
|
||||
self.assertEqual(self.filterset({'cf_cf10__empty': True}, self.queryset).qs.count(), 2)
|
||||
|
||||
def test_filter_object(self):
|
||||
manufacturer_ids = Manufacturer.objects.values_list('id', flat=True)
|
||||
@@ -1692,6 +1756,7 @@ class CustomFieldModelFilterTest(TestCase):
|
||||
self.filterset({'cf_cf11': [manufacturer_ids[0], manufacturer_ids[1]]}, self.queryset).qs.count(),
|
||||
2
|
||||
)
|
||||
self.assertEqual(self.filterset({'cf_cf11__empty': True}, self.queryset).qs.count(), 1)
|
||||
|
||||
def test_filter_multiobject(self):
|
||||
manufacturer_ids = Manufacturer.objects.values_list('id', flat=True)
|
||||
@@ -1703,3 +1768,4 @@ class CustomFieldModelFilterTest(TestCase):
|
||||
self.filterset({'cf_cf12': [manufacturer_ids[3]]}, self.queryset).qs.count(),
|
||||
3
|
||||
)
|
||||
self.assertEqual(self.filterset({'cf_cf12__empty': True}, self.queryset).qs.count(), 1)
|
||||
|
||||
@@ -1,17 +1,95 @@
|
||||
import tempfile
|
||||
from pathlib import Path
|
||||
|
||||
from django.contrib.contenttypes.models import ContentType
|
||||
from django.core.files.uploadedfile import SimpleUploadedFile
|
||||
from django.forms import ValidationError
|
||||
from django.test import tag, TestCase
|
||||
|
||||
from core.models import DataSource, ObjectType
|
||||
from dcim.models import Device, DeviceRole, DeviceType, Location, Manufacturer, Platform, Region, Site, SiteGroup
|
||||
from extras.models import ConfigContext, ConfigContextProfile, ConfigTemplate, Tag
|
||||
from extras.models import ConfigContext, ConfigContextProfile, ConfigTemplate, ImageAttachment, Tag, TaggedItem
|
||||
from tenancy.models import Tenant, TenantGroup
|
||||
from utilities.exceptions import AbortRequest
|
||||
from virtualization.models import Cluster, ClusterGroup, ClusterType, VirtualMachine
|
||||
|
||||
|
||||
class ImageAttachmentTests(TestCase):
|
||||
@classmethod
|
||||
def setUpTestData(cls):
|
||||
cls.ct_rack = ContentType.objects.get(app_label='dcim', model='rack')
|
||||
cls.image_content = b''
|
||||
|
||||
def _stub_image_attachment(self, object_id, image_filename, name=None):
|
||||
"""
|
||||
Creates an instance of ImageAttachment with the provided object_id and image_name.
|
||||
|
||||
This method prepares a stubbed image attachment to test functionalities that
|
||||
require an ImageAttachment object.
|
||||
The function initializes the attachment with a specified file name and
|
||||
pre-defined image content.
|
||||
"""
|
||||
ia = ImageAttachment(
|
||||
object_type=self.ct_rack,
|
||||
object_id=object_id,
|
||||
name=name,
|
||||
image=SimpleUploadedFile(
|
||||
name=image_filename,
|
||||
content=self.image_content,
|
||||
content_type='image/jpeg',
|
||||
),
|
||||
)
|
||||
return ia
|
||||
|
||||
def test_filename_strips_expected_prefix(self):
|
||||
"""
|
||||
Tests that the filename of the image attachment is stripped of the expected
|
||||
prefix.
|
||||
"""
|
||||
ia = self._stub_image_attachment(12, 'image-attachments/rack_12_My_File.png')
|
||||
self.assertEqual(ia.filename, 'My_File.png')
|
||||
|
||||
def test_filename_legacy_nested_path_returns_basename(self):
|
||||
"""
|
||||
Tests if the filename of a legacy-nested path correctly returns only the basename.
|
||||
"""
|
||||
# e.g. "image-attachments/rack_12_5/31/23.jpg" -> "23.jpg"
|
||||
ia = self._stub_image_attachment(12, 'image-attachments/rack_12_5/31/23.jpg')
|
||||
self.assertEqual(ia.filename, '23.jpg')
|
||||
|
||||
def test_filename_no_prefix_returns_basename(self):
|
||||
"""
|
||||
Tests that the filename property correctly returns the basename for an image
|
||||
attachment that has no leading prefix in its path.
|
||||
"""
|
||||
ia = self._stub_image_attachment(42, 'image-attachments/just_name.webp')
|
||||
self.assertEqual(ia.filename, 'just_name.webp')
|
||||
|
||||
def test_mismatched_prefix_is_not_stripped(self):
|
||||
"""
|
||||
Tests that a mismatched prefix in the filename is not stripped.
|
||||
"""
|
||||
# Prefix does not match object_id -> leave as-is (basename only)
|
||||
ia = self._stub_image_attachment(12, 'image-attachments/rack_13_other.png')
|
||||
self.assertEqual('rack_13_other.png', ia.filename)
|
||||
|
||||
def test_str_uses_name_when_present(self):
|
||||
"""
|
||||
Tests that the `str` representation of the object uses the
|
||||
`name` attribute when provided.
|
||||
"""
|
||||
ia = self._stub_image_attachment(12, 'image-attachments/rack_12_file.png', name='Human title')
|
||||
self.assertEqual('Human title', str(ia))
|
||||
|
||||
def test_str_falls_back_to_filename(self):
|
||||
"""
|
||||
Tests that the `str` representation of the object falls back to
|
||||
the filename if the name attribute is not set.
|
||||
"""
|
||||
ia = self._stub_image_attachment(12, 'image-attachments/rack_12_file.png', name='')
|
||||
self.assertEqual('file.png', str(ia))
|
||||
|
||||
|
||||
class TagTest(TestCase):
|
||||
|
||||
def test_default_ordering_weight_then_name_is_set(self):
|
||||
@@ -445,7 +523,7 @@ class ConfigContextTest(TestCase):
|
||||
vm1 = VirtualMachine.objects.create(name="VM 1", site=site, role=vm_role)
|
||||
vm2 = VirtualMachine.objects.create(name="VM 2", cluster=cluster, role=vm_role)
|
||||
|
||||
# Check that their individually-rendered config contexts are identical
|
||||
# Check that their individually rendered config contexts are identical
|
||||
self.assertEqual(
|
||||
vm1.get_config_context(),
|
||||
vm2.get_config_context()
|
||||
@@ -458,11 +536,39 @@ class ConfigContextTest(TestCase):
|
||||
vms[1].get_config_context()
|
||||
)
|
||||
|
||||
def test_valid_local_context_data(self):
|
||||
device = Device.objects.first()
|
||||
device.local_context_data = None
|
||||
device.clean()
|
||||
|
||||
device.local_context_data = {"foo": "bar"}
|
||||
device.clean()
|
||||
|
||||
def test_invalid_local_context_data(self):
|
||||
device = Device.objects.first()
|
||||
|
||||
device.local_context_data = ""
|
||||
with self.assertRaises(ValidationError):
|
||||
device.clean()
|
||||
|
||||
device.local_context_data = 0
|
||||
with self.assertRaises(ValidationError):
|
||||
device.clean()
|
||||
|
||||
device.local_context_data = False
|
||||
with self.assertRaises(ValidationError):
|
||||
device.clean()
|
||||
|
||||
device.local_context_data = 'foo'
|
||||
with self.assertRaises(ValidationError):
|
||||
device.clean()
|
||||
|
||||
@tag('regression')
|
||||
def test_multiple_tags_return_distinct_objects(self):
|
||||
"""
|
||||
Tagged items use a generic relationship, which results in duplicate rows being returned when queried.
|
||||
This is combated by appending distinct() to the config context querysets. This test creates a config
|
||||
context assigned to two tags and ensures objects related by those same two tags result in only a single
|
||||
context assigned to two tags and ensures objects related to those same two tags result in only a single
|
||||
config context record being returned.
|
||||
|
||||
See https://github.com/netbox-community/netbox/issues/5314
|
||||
@@ -495,14 +601,15 @@ class ConfigContextTest(TestCase):
|
||||
self.assertEqual(ConfigContext.objects.get_for_object(device).count(), 1)
|
||||
self.assertEqual(device.get_config_context(), annotated_queryset[0].get_config_context())
|
||||
|
||||
def test_multiple_tags_return_distinct_objects_with_seperate_config_contexts(self):
|
||||
@tag('regression')
|
||||
def test_multiple_tags_return_distinct_objects_with_separate_config_contexts(self):
|
||||
"""
|
||||
Tagged items use a generic relationship, which results in duplicate rows being returned when queried.
|
||||
This is combatted by by appending distinct() to the config context querysets. This test creates a config
|
||||
context assigned to two tags and ensures objects related by those same two tags result in only a single
|
||||
This is combated by appending distinct() to the config context querysets. This test creates a config
|
||||
context assigned to two tags and ensures objects related to those same two tags result in only a single
|
||||
config context record being returned.
|
||||
|
||||
This test case is seperate from the above in that it deals with multiple config context objects in play.
|
||||
This test case is separate from the above in that it deals with multiple config context objects in play.
|
||||
|
||||
See https://github.com/netbox-community/netbox/issues/5387
|
||||
"""
|
||||
@@ -543,32 +650,47 @@ class ConfigContextTest(TestCase):
|
||||
self.assertEqual(ConfigContext.objects.get_for_object(device).count(), 2)
|
||||
self.assertEqual(device.get_config_context(), annotated_queryset[0].get_config_context())
|
||||
|
||||
def test_valid_local_context_data(self):
|
||||
@tag('performance', 'regression')
|
||||
def test_config_context_annotation_query_optimization(self):
|
||||
"""
|
||||
Regression test for issue #20327: Ensure config context annotation
|
||||
doesn't use expensive DISTINCT on main query.
|
||||
|
||||
Verifies that DISTINCT is only used in tag subquery where needed,
|
||||
not on the main device query which is expensive for large datasets.
|
||||
"""
|
||||
device = Device.objects.first()
|
||||
device.local_context_data = None
|
||||
device.clean()
|
||||
queryset = Device.objects.filter(pk=device.pk).annotate_config_context_data()
|
||||
|
||||
device.local_context_data = {"foo": "bar"}
|
||||
device.clean()
|
||||
# Main device query should NOT use DISTINCT
|
||||
self.assertFalse(queryset.query.distinct)
|
||||
|
||||
def test_invalid_local_context_data(self):
|
||||
device = Device.objects.first()
|
||||
# Check that tag subqueries DO use DISTINCT by inspecting the annotation
|
||||
config_annotation = queryset.query.annotations.get('config_context_data')
|
||||
self.assertIsNotNone(config_annotation)
|
||||
|
||||
device.local_context_data = ""
|
||||
with self.assertRaises(ValidationError):
|
||||
device.clean()
|
||||
def find_tag_subqueries(where_node):
|
||||
"""Find subqueries in WHERE clause that relate to tag filtering"""
|
||||
subqueries = []
|
||||
|
||||
device.local_context_data = 0
|
||||
with self.assertRaises(ValidationError):
|
||||
device.clean()
|
||||
def traverse(node):
|
||||
if hasattr(node, 'children'):
|
||||
for child in node.children:
|
||||
try:
|
||||
if child.rhs.query.model is TaggedItem:
|
||||
subqueries.append(child.rhs.query)
|
||||
except AttributeError:
|
||||
traverse(child)
|
||||
traverse(where_node)
|
||||
return subqueries
|
||||
|
||||
device.local_context_data = False
|
||||
with self.assertRaises(ValidationError):
|
||||
device.clean()
|
||||
# Find subqueries in the WHERE clause that should have DISTINCT
|
||||
tag_subqueries = find_tag_subqueries(config_annotation.query.where)
|
||||
distinct_subqueries = [sq for sq in tag_subqueries if sq.distinct]
|
||||
|
||||
device.local_context_data = 'foo'
|
||||
with self.assertRaises(ValidationError):
|
||||
device.clean()
|
||||
# Verify we found at least one DISTINCT subquery for tags
|
||||
self.assertEqual(len(distinct_subqueries), 1)
|
||||
self.assertTrue(distinct_subqueries[0].distinct)
|
||||
|
||||
|
||||
class ConfigTemplateTest(TestCase):
|
||||
|
||||
@@ -1,7 +1,10 @@
|
||||
from types import SimpleNamespace
|
||||
|
||||
from django.contrib.contenttypes.models import ContentType
|
||||
from django.test import TestCase
|
||||
|
||||
from extras.models import ExportTemplate
|
||||
from extras.utils import filename_from_model
|
||||
from extras.utils import filename_from_model, image_upload
|
||||
from tenancy.models import ContactGroup, TenantGroup
|
||||
from wireless.models import WirelessLANGroup
|
||||
|
||||
@@ -17,3 +20,141 @@ class FilenameFromModelTests(TestCase):
|
||||
|
||||
for model, expected in cases:
|
||||
self.assertEqual(filename_from_model(model), expected)
|
||||
|
||||
|
||||
class ImageUploadTests(TestCase):
|
||||
@classmethod
|
||||
def setUpTestData(cls):
|
||||
# We only need a ContentType with model="rack" for the prefix;
|
||||
# this doesn't require creating a Rack object.
|
||||
cls.ct_rack = ContentType.objects.get(app_label='dcim', model='rack')
|
||||
|
||||
def _stub_instance(self, object_id=12, name=None):
|
||||
"""
|
||||
Creates a minimal stub for use with the `image_upload()` function.
|
||||
|
||||
This method generates an instance of `SimpleNamespace` containing a set
|
||||
of attributes required to simulate the expected input for the
|
||||
`image_upload()` method.
|
||||
It is designed to simplify testing or processing by providing a
|
||||
lightweight representation of an object.
|
||||
"""
|
||||
return SimpleNamespace(object_type=self.ct_rack, object_id=object_id, name=name)
|
||||
|
||||
def _second_segment(self, path: str):
|
||||
"""
|
||||
Extracts and returns the portion of the input string after the
|
||||
first '/' character.
|
||||
"""
|
||||
return path.split('/', 1)[1]
|
||||
|
||||
def test_windows_fake_path_and_extension_lowercased(self):
|
||||
"""
|
||||
Tests handling of a Windows file path with a fake directory and extension.
|
||||
"""
|
||||
inst = self._stub_instance(name=None)
|
||||
path = image_upload(inst, r'C:\fake_path\MyPhoto.JPG')
|
||||
# Base directory and single-level path
|
||||
seg2 = self._second_segment(path)
|
||||
self.assertTrue(path.startswith('image-attachments/rack_12_'))
|
||||
self.assertNotIn('/', seg2, 'should not create nested directories')
|
||||
# Extension from the uploaded file, lowercased
|
||||
self.assertTrue(seg2.endswith('.jpg'))
|
||||
|
||||
def test_name_with_slashes_is_flattened_no_subdirectories(self):
|
||||
"""
|
||||
Tests that a name with slashes is flattened and does not
|
||||
create subdirectories.
|
||||
"""
|
||||
inst = self._stub_instance(name='5/31/23')
|
||||
path = image_upload(inst, 'image.png')
|
||||
seg2 = self._second_segment(path)
|
||||
self.assertTrue(seg2.startswith('rack_12_'))
|
||||
self.assertNotIn('/', seg2)
|
||||
self.assertNotIn('\\', seg2)
|
||||
self.assertTrue(seg2.endswith('.png'))
|
||||
|
||||
def test_name_with_backslashes_is_flattened_no_subdirectories(self):
|
||||
"""
|
||||
Tests that a name including backslashes is correctly flattened
|
||||
into a single directory name without creating subdirectories.
|
||||
"""
|
||||
inst = self._stub_instance(name=r'5\31\23')
|
||||
path = image_upload(inst, 'image_name.png')
|
||||
|
||||
seg2 = self._second_segment(path)
|
||||
self.assertTrue(seg2.startswith('rack_12_'))
|
||||
self.assertNotIn('/', seg2)
|
||||
self.assertNotIn('\\', seg2)
|
||||
self.assertTrue(seg2.endswith('.png'))
|
||||
|
||||
def test_prefix_format_is_as_expected(self):
|
||||
"""
|
||||
Tests the output path format generated by the `image_upload` function.
|
||||
"""
|
||||
inst = self._stub_instance(object_id=99, name='label')
|
||||
path = image_upload(inst, 'a.webp')
|
||||
# The second segment must begin with "rack_99_"
|
||||
seg2 = self._second_segment(path)
|
||||
self.assertTrue(seg2.startswith('rack_99_'))
|
||||
self.assertTrue(seg2.endswith('.webp'))
|
||||
|
||||
def test_unsupported_file_extension(self):
|
||||
"""
|
||||
Test that when the file extension is not allowed, the extension
|
||||
is omitted.
|
||||
"""
|
||||
inst = self._stub_instance(name='test')
|
||||
path = image_upload(inst, 'document.txt')
|
||||
|
||||
seg2 = self._second_segment(path)
|
||||
self.assertTrue(seg2.startswith('rack_12_test'))
|
||||
self.assertFalse(seg2.endswith('.txt'))
|
||||
# When not allowed, no extension should be appended
|
||||
self.assertNotRegex(seg2, r'\.txt$')
|
||||
|
||||
def test_instance_name_with_whitespace_and_special_chars(self):
|
||||
"""
|
||||
Test that an instance name with leading/trailing whitespace and
|
||||
special characters is sanitized properly.
|
||||
"""
|
||||
# Suppose the instance name has surrounding whitespace and
|
||||
# extra slashes.
|
||||
inst = self._stub_instance(name=' my/complex\\name ')
|
||||
path = image_upload(inst, 'irrelevant.png')
|
||||
|
||||
# The output should be flattened and sanitized.
|
||||
# We expect the name to be transformed into a valid filename without
|
||||
# path separators.
|
||||
seg2 = self._second_segment(path)
|
||||
self.assertNotIn(' ', seg2)
|
||||
self.assertNotIn('/', seg2)
|
||||
self.assertNotIn('\\', seg2)
|
||||
self.assertTrue(seg2.endswith('.png'))
|
||||
|
||||
def test_separator_variants_with_subTest(self):
|
||||
"""
|
||||
Tests that both forward slash and backslash in file paths are
|
||||
handled consistently by the `image_upload` function and
|
||||
processed into a sanitized uniform format.
|
||||
"""
|
||||
for name in ['2025/09/12', r'2025\09\12']:
|
||||
with self.subTest(name=name):
|
||||
inst = self._stub_instance(name=name)
|
||||
path = image_upload(inst, 'x.jpeg')
|
||||
seg2 = self._second_segment(path)
|
||||
self.assertTrue(seg2.startswith('rack_12_'))
|
||||
self.assertNotIn('/', seg2)
|
||||
self.assertNotIn('\\', seg2)
|
||||
self.assertTrue(seg2.endswith('.jpeg') or seg2.endswith('.jpg'))
|
||||
|
||||
def test_fallback_on_suspicious_file_operation(self):
|
||||
"""
|
||||
Test that when default_storage.get_valid_name() raises a
|
||||
SuspiciousFileOperation, the fallback default is used.
|
||||
"""
|
||||
inst = self._stub_instance(name=' ')
|
||||
path = image_upload(inst, 'sample.png')
|
||||
# Expect the fallback name 'unnamed' to be used.
|
||||
self.assertIn('unnamed', path)
|
||||
self.assertTrue(path.startswith('image-attachments/rack_12_'))
|
||||
|
||||
@@ -1,15 +1,20 @@
|
||||
import importlib
|
||||
from pathlib import Path
|
||||
|
||||
from django.core.exceptions import ImproperlyConfigured
|
||||
from django.core.exceptions import ImproperlyConfigured, SuspiciousFileOperation
|
||||
from django.core.files.storage import default_storage
|
||||
from django.core.files.utils import validate_file_name
|
||||
from django.db import models
|
||||
from django.db.models import Q
|
||||
from taggit.managers import _TaggableManager
|
||||
|
||||
from netbox.context import current_request
|
||||
|
||||
from .validators import CustomValidator
|
||||
|
||||
__all__ = (
|
||||
'SharedObjectViewMixin',
|
||||
'filename_from_model',
|
||||
'image_upload',
|
||||
'is_report',
|
||||
'is_script',
|
||||
@@ -35,13 +40,13 @@ class SharedObjectViewMixin:
|
||||
|
||||
|
||||
def filename_from_model(model: models.Model) -> str:
|
||||
"""Standardises how we generate filenames from model class for exports"""
|
||||
"""Standardizes how we generate filenames from model class for exports"""
|
||||
base = model._meta.verbose_name_plural.lower().replace(' ', '_')
|
||||
return f'netbox_{base}'
|
||||
|
||||
|
||||
def filename_from_object(context: dict) -> str:
|
||||
"""Standardises how we generate filenames from model class for exports"""
|
||||
"""Standardizes how we generate filenames from model class for exports"""
|
||||
if 'device' in context:
|
||||
base = f"{context['device'].name or 'config'}"
|
||||
elif 'virtualmachine' in context:
|
||||
@@ -64,17 +69,42 @@ def is_taggable(obj):
|
||||
def image_upload(instance, filename):
|
||||
"""
|
||||
Return a path for uploading image attachments.
|
||||
|
||||
- Normalizes browser paths (e.g., C:\\fake_path\\photo.jpg)
|
||||
- Uses the instance.name if provided (sanitized to a *basename*, no ext)
|
||||
- Prefixes with a machine-friendly identifier
|
||||
|
||||
Note: Relies on Django's default_storage utility.
|
||||
"""
|
||||
path = 'image-attachments/'
|
||||
upload_dir = 'image-attachments'
|
||||
default_filename = 'unnamed'
|
||||
allowed_img_extensions = ('bmp', 'gif', 'jpeg', 'jpg', 'png', 'webp')
|
||||
|
||||
# Rename the file to the provided name, if any. Attempt to preserve the file extension.
|
||||
extension = filename.rsplit('.')[-1].lower()
|
||||
if instance.name and extension in ['bmp', 'gif', 'jpeg', 'jpg', 'png', 'webp']:
|
||||
filename = '.'.join([instance.name, extension])
|
||||
elif instance.name:
|
||||
filename = instance.name
|
||||
# Normalize Windows paths and create a Path object.
|
||||
normalized_filename = str(filename).replace('\\', '/')
|
||||
file_path = Path(normalized_filename)
|
||||
|
||||
return '{}{}_{}_{}'.format(path, instance.object_type.name, instance.object_id, filename)
|
||||
# Extract the extension from the uploaded file.
|
||||
ext = file_path.suffix.lower().lstrip('.')
|
||||
|
||||
# Use the instance-provided name if available; otherwise use the file stem.
|
||||
# Rely on Django's get_valid_filename to perform sanitization.
|
||||
stem = (instance.name or file_path.stem).strip()
|
||||
try:
|
||||
safe_stem = default_storage.get_valid_name(stem)
|
||||
except SuspiciousFileOperation:
|
||||
safe_stem = default_filename
|
||||
|
||||
# Append the uploaded extension only if it's an allowed image type
|
||||
final_name = f"{safe_stem}.{ext}" if ext in allowed_img_extensions else safe_stem
|
||||
|
||||
# Create a machine-friendly prefix from the instance
|
||||
prefix = f"{instance.object_type.model}_{instance.object_id}"
|
||||
name_with_path = f"{upload_dir}/{prefix}_{final_name}"
|
||||
|
||||
# Validate the generated relative path (blocks absolute/traversal)
|
||||
validate_file_name(name_with_path, allow_relative_path=True)
|
||||
return name_with_path
|
||||
|
||||
|
||||
def is_script(obj):
|
||||
@@ -107,7 +137,7 @@ def run_validators(instance, validators):
|
||||
request = current_request.get()
|
||||
for validator in validators:
|
||||
|
||||
# Loading a validator class by dotted path
|
||||
# Loading a validator class by a dotted path
|
||||
if type(validator) is str:
|
||||
module, cls = validator.rsplit('.', 1)
|
||||
validator = getattr(importlib.import_module(module), cls)()
|
||||
|
||||
@@ -1,3 +1,4 @@
|
||||
from datetime import datetime
|
||||
from django.contrib import messages
|
||||
from django.contrib.auth.mixins import LoginRequiredMixin
|
||||
from django.contrib.contenttypes.models import ContentType
|
||||
@@ -1547,7 +1548,6 @@ class ScriptResultView(TableMixin, generic.ObjectView):
|
||||
except KeyError:
|
||||
log_threshold = LOG_LEVEL_RANK[LogLevelChoices.LOG_INFO]
|
||||
if job.data:
|
||||
|
||||
if 'log' in job.data:
|
||||
if 'tests' in job.data:
|
||||
tests = job.data['tests']
|
||||
@@ -1558,7 +1558,7 @@ class ScriptResultView(TableMixin, generic.ObjectView):
|
||||
index += 1
|
||||
result = {
|
||||
'index': index,
|
||||
'time': log.get('time'),
|
||||
'time': datetime.fromisoformat(log.get('time')),
|
||||
'status': log.get('status'),
|
||||
'message': log.get('message'),
|
||||
'object': log.get('obj'),
|
||||
|
||||
@@ -804,6 +804,7 @@ class FHRPGroupFilterSet(NetBoxModelFilterSet):
|
||||
return queryset
|
||||
return queryset.filter(
|
||||
Q(description__icontains=value) |
|
||||
Q(group_id__contains=value) |
|
||||
Q(name__icontains=value)
|
||||
)
|
||||
|
||||
|
||||
@@ -580,13 +580,6 @@ class FHRPGroupAssignmentForm(forms.ModelForm):
|
||||
model = FHRPGroupAssignment
|
||||
fields = ('group', 'priority')
|
||||
|
||||
def __init__(self, *args, **kwargs):
|
||||
super().__init__(*args, **kwargs)
|
||||
|
||||
ipaddresses = self.instance.interface.ip_addresses.all()
|
||||
for ipaddress in ipaddresses:
|
||||
self.fields['group'].widget.add_query_param('related_ip', ipaddress.pk)
|
||||
|
||||
def clean_group(self):
|
||||
group = self.cleaned_data['group']
|
||||
|
||||
|
||||
@@ -164,7 +164,7 @@ def available_vlans_from_range(vlans, vlan_group, vid_range):
|
||||
prev_vid = vlan.vid
|
||||
|
||||
# Annotate any remaining available VLANs
|
||||
if prev_vid < max_vid:
|
||||
if prev_vid < max_vid - 1:
|
||||
new_vlans.append({
|
||||
'vid': prev_vid + 1,
|
||||
'vlan_group': vlan_group,
|
||||
|
||||
@@ -54,8 +54,26 @@ class VRFView(GetRelatedModelsMixin, generic.ObjectView):
|
||||
)
|
||||
export_targets_table.configure(request)
|
||||
|
||||
related_models = self.get_related_models(
|
||||
request,
|
||||
instance,
|
||||
omit=(Interface, VMInterface),
|
||||
extra=(
|
||||
(
|
||||
Interface.objects.restrict(request.user, 'view').filter(vrf=instance),
|
||||
'vrf_id',
|
||||
_('Device Interfaces')
|
||||
),
|
||||
(
|
||||
VMInterface.objects.restrict(request.user, 'view').filter(vrf=instance),
|
||||
'vrf_id',
|
||||
_('VM Interfaces')
|
||||
),
|
||||
),
|
||||
)
|
||||
|
||||
return {
|
||||
'related_models': self.get_related_models(request, instance, omit=[Interface, VMInterface]),
|
||||
'related_models': related_models,
|
||||
'import_targets_table': import_targets_table,
|
||||
'export_targets_table': export_targets_table,
|
||||
}
|
||||
|
||||
@@ -29,6 +29,13 @@ __all__ = (
|
||||
'OrganizationalModelFilterSet',
|
||||
)
|
||||
|
||||
STANDARD_LOOKUPS = (
|
||||
'exact',
|
||||
'iexact',
|
||||
'in',
|
||||
'contains',
|
||||
)
|
||||
|
||||
|
||||
#
|
||||
# FilterSets
|
||||
@@ -159,7 +166,7 @@ class BaseFilterSet(django_filters.FilterSet):
|
||||
return {}
|
||||
|
||||
# Skip nonstandard lookup expressions
|
||||
if existing_filter.method is not None or existing_filter.lookup_expr not in ['exact', 'iexact', 'in']:
|
||||
if existing_filter.method is not None or existing_filter.lookup_expr not in STANDARD_LOOKUPS:
|
||||
return {}
|
||||
|
||||
# Choose the lookup expression map based on the filter type
|
||||
|
||||
@@ -22,6 +22,7 @@ from netbox.models.deletion import DeleteMixin
|
||||
from netbox.plugins import PluginConfig
|
||||
from netbox.registry import registry
|
||||
from netbox.signals import post_clean
|
||||
from netbox.utils import register_model_feature
|
||||
from utilities.json import CustomFieldJSONEncoder
|
||||
from utilities.serialization import serialize_object
|
||||
|
||||
@@ -35,7 +36,6 @@ __all__ = (
|
||||
'CustomValidationMixin',
|
||||
'EventRulesMixin',
|
||||
'ExportTemplatesMixin',
|
||||
'FEATURES_MAP',
|
||||
'ImageAttachmentsMixin',
|
||||
'JobsMixin',
|
||||
'JournalingMixin',
|
||||
@@ -628,28 +628,21 @@ class SyncedDataMixin(models.Model):
|
||||
# Feature registration
|
||||
#
|
||||
|
||||
FEATURES_MAP = {
|
||||
'bookmarks': BookmarksMixin,
|
||||
'change_logging': ChangeLoggingMixin,
|
||||
'cloning': CloningMixin,
|
||||
'contacts': ContactsMixin,
|
||||
'custom_fields': CustomFieldsMixin,
|
||||
'custom_links': CustomLinksMixin,
|
||||
'custom_validation': CustomValidationMixin,
|
||||
'event_rules': EventRulesMixin,
|
||||
'export_templates': ExportTemplatesMixin,
|
||||
'image_attachments': ImageAttachmentsMixin,
|
||||
'jobs': JobsMixin,
|
||||
'journaling': JournalingMixin,
|
||||
'notifications': NotificationsMixin,
|
||||
'synced_data': SyncedDataMixin,
|
||||
'tags': TagsMixin,
|
||||
}
|
||||
|
||||
# TODO: Remove in NetBox v4.5
|
||||
registry['model_features'].update({
|
||||
feature: defaultdict(set) for feature in FEATURES_MAP.keys()
|
||||
})
|
||||
register_model_feature('bookmarks', lambda model: issubclass(model, BookmarksMixin))
|
||||
register_model_feature('change_logging', lambda model: issubclass(model, ChangeLoggingMixin))
|
||||
register_model_feature('cloning', lambda model: issubclass(model, CloningMixin))
|
||||
register_model_feature('contacts', lambda model: issubclass(model, ContactsMixin))
|
||||
register_model_feature('custom_fields', lambda model: issubclass(model, CustomFieldsMixin))
|
||||
register_model_feature('custom_links', lambda model: issubclass(model, CustomLinksMixin))
|
||||
register_model_feature('custom_validation', lambda model: issubclass(model, CustomValidationMixin))
|
||||
register_model_feature('event_rules', lambda model: issubclass(model, EventRulesMixin))
|
||||
register_model_feature('export_templates', lambda model: issubclass(model, ExportTemplatesMixin))
|
||||
register_model_feature('image_attachments', lambda model: issubclass(model, ImageAttachmentsMixin))
|
||||
register_model_feature('jobs', lambda model: issubclass(model, JobsMixin))
|
||||
register_model_feature('journaling', lambda model: issubclass(model, JournalingMixin))
|
||||
register_model_feature('notifications', lambda model: issubclass(model, NotificationsMixin))
|
||||
register_model_feature('synced_data', lambda model: issubclass(model, SyncedDataMixin))
|
||||
register_model_feature('tags', lambda model: issubclass(model, TagsMixin))
|
||||
|
||||
|
||||
def model_is_public(model):
|
||||
@@ -665,8 +658,11 @@ def model_is_public(model):
|
||||
|
||||
|
||||
def get_model_features(model):
|
||||
"""
|
||||
Return all features supported by the given model.
|
||||
"""
|
||||
return [
|
||||
feature for feature, cls in FEATURES_MAP.items() if issubclass(model, cls)
|
||||
feature for feature, test_func in registry['model_features'].items() if test_func(model)
|
||||
]
|
||||
|
||||
|
||||
@@ -710,19 +706,6 @@ def register_models(*models):
|
||||
if not getattr(model, '_netbox_private', False):
|
||||
registry['models'][app_label].add(model_name)
|
||||
|
||||
# TODO: Remove in NetBox v4.5
|
||||
# Record each applicable feature for the model in the registry
|
||||
features = {
|
||||
feature for feature, cls in FEATURES_MAP.items() if issubclass(model, cls)
|
||||
}
|
||||
for feature in features:
|
||||
try:
|
||||
registry['model_features'][feature][app_label].add(model_name)
|
||||
except KeyError:
|
||||
raise KeyError(
|
||||
f"{feature} is not a valid model feature! Valid keys are: {registry['model_features'].keys()}"
|
||||
)
|
||||
|
||||
# Register applicable feature views for the model
|
||||
if issubclass(model, ContactsMixin):
|
||||
register_model_view(model, 'contacts', kwargs={'model': model})(
|
||||
|
||||
@@ -84,6 +84,7 @@ CORS_ORIGIN_REGEX_WHITELIST = getattr(configuration, 'CORS_ORIGIN_REGEX_WHITELIS
|
||||
CORS_ORIGIN_WHITELIST = getattr(configuration, 'CORS_ORIGIN_WHITELIST', [])
|
||||
CSRF_COOKIE_NAME = getattr(configuration, 'CSRF_COOKIE_NAME', 'csrftoken')
|
||||
CSRF_COOKIE_PATH = f'/{BASE_PATH.rstrip("/")}'
|
||||
CSRF_COOKIE_HTTPONLY = True
|
||||
CSRF_COOKIE_SECURE = getattr(configuration, 'CSRF_COOKIE_SECURE', False)
|
||||
CSRF_TRUSTED_ORIGINS = getattr(configuration, 'CSRF_TRUSTED_ORIGINS', [])
|
||||
DATA_UPLOAD_MAX_MEMORY_SIZE = getattr(configuration, 'DATA_UPLOAD_MAX_MEMORY_SIZE', 2621440)
|
||||
|
||||
@@ -3,6 +3,7 @@ from netbox.registry import registry
|
||||
__all__ = (
|
||||
'get_data_backend_choices',
|
||||
'register_data_backend',
|
||||
'register_model_feature',
|
||||
'register_request_processor',
|
||||
)
|
||||
|
||||
@@ -27,6 +28,35 @@ def register_data_backend():
|
||||
return _wrapper
|
||||
|
||||
|
||||
def register_model_feature(name, func=None):
|
||||
"""
|
||||
Register a model feature with its qualifying function.
|
||||
|
||||
The qualifying function must accept a single `model` argument. It will be called to determine whether the given
|
||||
model supports the corresponding feature.
|
||||
|
||||
This function can be used directly:
|
||||
|
||||
register_model_feature('my_feature', my_func)
|
||||
|
||||
Or as a decorator:
|
||||
|
||||
@register_model_feature('my_feature')
|
||||
def my_func(model):
|
||||
...
|
||||
"""
|
||||
def decorator(f):
|
||||
registry['model_features'][name] = f
|
||||
return f
|
||||
|
||||
if name in registry['model_features']:
|
||||
raise ValueError(f"A model feature named {name} is already registered.")
|
||||
|
||||
if func is None:
|
||||
return decorator
|
||||
return decorator(func)
|
||||
|
||||
|
||||
def register_request_processor(func):
|
||||
"""
|
||||
Decorator for registering a request processor.
|
||||
|
||||
@@ -15,7 +15,7 @@ from django.utils.translation import gettext as _
|
||||
|
||||
from core.signals import clear_events
|
||||
from netbox.object_actions import (
|
||||
AddObject, BulkDelete, BulkEdit, BulkExport, BulkImport, CloneObject, DeleteObject, EditObject,
|
||||
BulkDelete, BulkEdit, BulkExport, BulkImport, CloneObject, DeleteObject, EditObject,
|
||||
)
|
||||
from utilities.error_handlers import handle_protectederror
|
||||
from utilities.exceptions import AbortRequest, PermissionsViolation
|
||||
@@ -103,7 +103,7 @@ class ObjectChildrenView(ObjectView, ActionsMixin, TableMixin):
|
||||
table = None
|
||||
filterset = None
|
||||
filterset_form = None
|
||||
actions = (AddObject, BulkImport, BulkEdit, BulkExport, BulkDelete)
|
||||
actions = (BulkImport, BulkEdit, BulkExport, BulkDelete)
|
||||
template_name = 'generic/object_children.html'
|
||||
|
||||
def get_children(self, request, parent):
|
||||
|
||||
@@ -20,7 +20,7 @@ from netbox.search.backends import search_backend
|
||||
from netbox.tables import SearchTable
|
||||
from utilities.htmx import htmx_partial
|
||||
from utilities.paginator import EnhancedPaginator, get_paginate_count
|
||||
from utilities.views import ConditionalLoginRequiredMixin
|
||||
from utilities.views import ConditionalLoginRequiredMixin, TokenConditionalLoginRequiredMixin
|
||||
|
||||
__all__ = (
|
||||
'HomeView',
|
||||
@@ -119,7 +119,7 @@ class SearchView(ConditionalLoginRequiredMixin, View):
|
||||
})
|
||||
|
||||
|
||||
class MediaView(ConditionalLoginRequiredMixin, View):
|
||||
class MediaView(TokenConditionalLoginRequiredMixin, View):
|
||||
"""
|
||||
Wrap Django's serve() view to enforce LOGIN_REQUIRED for static media.
|
||||
"""
|
||||
|
||||
2
netbox/project-static/dist/netbox.css
vendored
2
netbox/project-static/dist/netbox.css
vendored
File diff suppressed because one or more lines are too long
10
netbox/project-static/dist/netbox.js
vendored
10
netbox/project-static/dist/netbox.js
vendored
File diff suppressed because one or more lines are too long
6
netbox/project-static/dist/netbox.js.map
vendored
6
netbox/project-static/dist/netbox.js.map
vendored
File diff suppressed because one or more lines are too long
@@ -1 +1 @@
|
||||
svg{--nbx-rack-bg: var(--tblr-bg-surface-secondary);--nbx-rack-border: #000;--nbx-rack-slot-bg: #e9ecef;--nbx-rack-slot-border: #adb5bd;--nbx-rack-slot-hover-bg: #ced4da;--nbx-rack-link-color: #0d6efd;--nbx-rack-unit-color: #6c757d}svg[data-bs-theme=dark]{--nbx-rack-bg: rgb(27, 41, 58);--nbx-rack-border: #6c757d;--nbx-rack-slot-bg: #343a40;--nbx-rack-slot-border: #495057;--nbx-rack-slot-hover-bg: #212529;--nbx-rack-link-color: #9ec5fe;--nbx-rack-unit-color: #adb5bd}*{font-family:system-ui,-apple-system,Segoe UI,Roboto,Helvetica Neue,Noto Sans,Liberation Sans,Arial,sans-serif,"Apple Color Emoji","Segoe UI Emoji",Segoe UI Symbol,"Noto Color Emoji";font-size:.875rem}rect{box-sizing:border-box}text{text-anchor:middle;dominant-baseline:middle}svg{background-color:var(--nbx-rack-bg)}svg .unit{margin:0;padding:5px 0;fill:var(--nbx-rack-unit-color)}svg .hidden{visibility:hidden}svg rect.shaded,svg image.shaded{opacity:25%}svg text.shaded{opacity:50%}svg .rack{fill:none;stroke-width:2px;stroke:var(--nbx-rack-border)}svg .slot{fill:var(--nbx-rack-slot-bg);stroke:var(--nbx-rack-slot-border)}svg .slot:hover{fill:var(--nbx-rack-slot-hover-bg)}svg .slot+.add-device{fill:var(--nbx-rack-link-color);opacity:0;pointer-events:none}svg .slot:hover+.add-device{opacity:1}svg .slot.occupied[class],svg .slot.occupied:hover[class]{fill:url(#occupied)}svg .slot.blocked[class],svg .slot.blocked:hover[class]{fill:url(#blocked)}svg .slot.blocked:hover+.add-device{opacity:0}svg .reservation[class]{fill:url(#reserved)}
|
||||
svg{--nbx-rack-bg: var(--tblr-bg-surface-secondary);--nbx-rack-border: #000;--nbx-rack-slot-bg: #e9ecef;--nbx-rack-slot-border: #adb5bd;--nbx-rack-slot-hover-bg: #ced4da;--nbx-rack-link-color: #0d6efd;--nbx-rack-unit-color: #6c757d}svg[data-bs-theme=dark]{--nbx-rack-bg: rgb(27, 41, 58);--nbx-rack-border: #6c757d;--nbx-rack-slot-bg: #343a40;--nbx-rack-slot-border: #495057;--nbx-rack-slot-hover-bg: #212529;--nbx-rack-link-color: #9ec5fe;--nbx-rack-unit-color: #adb5bd}rect{box-sizing:border-box}text{text-anchor:middle;dominant-baseline:middle}svg{background-color:var(--nbx-rack-bg);font-family:system-ui,-apple-system,Segoe UI,Roboto,Helvetica Neue,Noto Sans,Liberation Sans,Arial,sans-serif,"Apple Color Emoji","Segoe UI Emoji",Segoe UI Symbol,"Noto Color Emoji";font-size:.875rem}svg .unit{margin:0;padding:5px 0;fill:var(--nbx-rack-unit-color)}svg .hidden{visibility:hidden}svg rect.shaded,svg image.shaded{opacity:25%}svg text.shaded{opacity:50%}svg .rack{fill:none;stroke-width:2px;stroke:var(--nbx-rack-border)}svg .slot{fill:var(--nbx-rack-slot-bg);stroke:var(--nbx-rack-slot-border)}svg .slot:hover{fill:var(--nbx-rack-slot-hover-bg)}svg .slot+.add-device{fill:var(--nbx-rack-link-color);opacity:0;pointer-events:none}svg .slot:hover+.add-device{opacity:1}svg .slot.occupied[class],svg .slot.occupied:hover[class]{fill:url(#occupied)}svg .slot.blocked[class],svg .slot.blocked:hover[class]{fill:url(#blocked)}svg .slot.blocked:hover+.add-device{opacity:0}svg .reservation[class]{fill:url(#reserved)}
|
||||
|
||||
1
netbox/project-static/img/plugin-default.svg
Normal file
1
netbox/project-static/img/plugin-default.svg
Normal file
@@ -0,0 +1 @@
|
||||
<svg xmlns="http://www.w3.org/2000/svg" width="24" height="24" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round" class="icon icon-tabler icons-tabler-outline icon-tabler-box"><path stroke="none" d="M0 0h24v24H0z" fill="none"/><path d="M12 3l8 4.5l0 9l-8 4.5l-8 -4.5l0 -9l8 -4.5" /><path d="M12 12l8 -4.5" /><path d="M12 12l0 9" /><path d="M12 12l-8 -4.5" /></svg>
|
||||
|
After Width: | Height: | Size: 441 B |
@@ -1,6 +1,6 @@
|
||||
{
|
||||
"name": "netbox",
|
||||
"version": "4.3.0",
|
||||
"version": "4.4.0",
|
||||
"main": "dist/netbox.js",
|
||||
"license": "Apache-2.0",
|
||||
"private": true,
|
||||
@@ -24,13 +24,13 @@
|
||||
"dependencies": {
|
||||
"@mdi/font": "7.4.47",
|
||||
"@tabler/core": "1.4.0",
|
||||
"bootstrap": "5.3.7",
|
||||
"bootstrap": "5.3.8",
|
||||
"clipboard": "2.0.11",
|
||||
"flatpickr": "4.6.13",
|
||||
"gridstack": "12.3.3",
|
||||
"htmx.org": "2.0.6",
|
||||
"query-string": "9.2.2",
|
||||
"sass": "1.90.0",
|
||||
"htmx.org": "2.0.7",
|
||||
"query-string": "9.3.0",
|
||||
"sass": "1.92.1",
|
||||
"tom-select": "2.4.3",
|
||||
"typeface-inter": "3.18.1",
|
||||
"typeface-roboto-mono": "1.1.13"
|
||||
|
||||
@@ -12,11 +12,13 @@ pre.change-data {
|
||||
min-width: fit-content;
|
||||
|
||||
&.added {
|
||||
background-color: $green;
|
||||
color: var(--tblr-dark);
|
||||
background-color: $green-300;
|
||||
}
|
||||
|
||||
&.removed {
|
||||
background-color: $red;
|
||||
color: var(--tblr-dark);
|
||||
background-color: $red-300;
|
||||
}
|
||||
}
|
||||
}
|
||||
@@ -26,11 +28,13 @@ pre.change-diff {
|
||||
border-color: transparent;
|
||||
|
||||
&.change-added {
|
||||
background-color: $green;
|
||||
color: var(--tblr-dark);
|
||||
background-color: $green-300;
|
||||
}
|
||||
|
||||
&.change-removed {
|
||||
background-color: $red;
|
||||
color: var(--tblr-dark);
|
||||
background-color: $red-300;
|
||||
}
|
||||
}
|
||||
|
||||
|
||||
@@ -30,7 +30,7 @@
|
||||
|
||||
// Remove the bottom margin of the last <p> elements in markdown
|
||||
.rendered-markdown {
|
||||
p:last-of-type {
|
||||
p:last-child {
|
||||
margin-bottom: 0;
|
||||
}
|
||||
}
|
||||
|
||||
@@ -6,3 +6,9 @@
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
// Replicate styling of tom-selected <select> fields tagged with .is-invalid to
|
||||
// their corresponding TomSelect dropdowns
|
||||
select.tomselected.is-invalid + div.ts-wrapper {
|
||||
@extend .is-invalid;
|
||||
}
|
||||
|
||||
@@ -28,10 +28,6 @@ svg {
|
||||
}
|
||||
}
|
||||
|
||||
* {
|
||||
font-family: $font-family-sans-serif;
|
||||
font-size: $font-size-sm;
|
||||
}
|
||||
rect {
|
||||
box-sizing: border-box;
|
||||
}
|
||||
@@ -42,6 +38,8 @@ text {
|
||||
|
||||
svg {
|
||||
background-color: var(--nbx-rack-bg);
|
||||
font-family: $font-family-sans-serif;
|
||||
font-size: $font-size-sm;
|
||||
|
||||
// Rack unit numbers along left side of rack elevation.
|
||||
.unit {
|
||||
|
||||
@@ -1116,6 +1116,11 @@ bootstrap@5.3.7:
|
||||
resolved "https://registry.yarnpkg.com/bootstrap/-/bootstrap-5.3.7.tgz#8640065036124d961d885d80b5945745e1154d90"
|
||||
integrity sha512-7KgiD8UHjfcPBHEpDNg+zGz8L3LqR3GVwqZiBRFX04a1BCArZOz1r2kjly2HQ0WokqTO0v1nF+QAt8dsW4lKlw==
|
||||
|
||||
bootstrap@5.3.8:
|
||||
version "5.3.8"
|
||||
resolved "https://registry.yarnpkg.com/bootstrap/-/bootstrap-5.3.8.tgz#6401a10057a22752d21f4e19055508980656aeed"
|
||||
integrity sha512-HP1SZDqaLDPwsNiqRqi5NcP0SSXciX2s9E+RyqJIIqGo+vJeN5AJVM98CXmW/Wux0nQ5L7jeWUdplCEf0Ee+tg==
|
||||
|
||||
brace-expansion@^1.1.7:
|
||||
version "1.1.11"
|
||||
resolved "https://registry.yarnpkg.com/brace-expansion/-/brace-expansion-1.1.11.tgz#3c7fcbf529d87226f3d2f52b966ff5271eb441dd"
|
||||
@@ -2236,10 +2241,10 @@ hey-listen@^1.0.8:
|
||||
resolved "https://registry.yarnpkg.com/hey-listen/-/hey-listen-1.0.8.tgz#8e59561ff724908de1aa924ed6ecc84a56a9aa68"
|
||||
integrity sha512-COpmrF2NOg4TBWUJ5UVyaCU2A88wEMkUPK4hNqyCkqHbxT92BbvfjoSozkAIIm6XhicGlJHhFdullInrdhwU8Q==
|
||||
|
||||
htmx.org@2.0.6:
|
||||
version "2.0.6"
|
||||
resolved "https://registry.yarnpkg.com/htmx.org/-/htmx.org-2.0.6.tgz#42573483c72112e7e332dfe93043cd0eb32cda01"
|
||||
integrity sha512-7ythjYneGSk3yCHgtCnQeaoF+D+o7U2LF37WU3O0JYv3gTZSicdEFiI/Ai/NJyC5ZpYJWMpUb11OC5Lr6AfAqA==
|
||||
htmx.org@2.0.7:
|
||||
version "2.0.7"
|
||||
resolved "https://registry.yarnpkg.com/htmx.org/-/htmx.org-2.0.7.tgz#991571e009a2ea4cb60e7af8bb4c1c8c0de32ecd"
|
||||
integrity sha512-YiJqF3U5KyO28VC5mPfehKJPF+n1Gni+cupK+D69TF0nm7wY6AXn3a4mPWIikfAXtl1u1F1+ZhSCS7KT8pVmqA==
|
||||
|
||||
ignore@^5.2.0:
|
||||
version "5.3.2"
|
||||
@@ -2985,10 +2990,10 @@ punycode@^2.1.0:
|
||||
resolved "https://registry.yarnpkg.com/punycode/-/punycode-2.3.1.tgz#027422e2faec0b25e1549c3e1bd8309b9133b6e5"
|
||||
integrity sha512-vYt7UD1U9Wg6138shLtLOvdAu+8DsC/ilFtEVHcH+wydcSpNE20AfSOduf6MkRFahL5FY7X1oU7nKVZFtfq8Fg==
|
||||
|
||||
query-string@9.2.2:
|
||||
version "9.2.2"
|
||||
resolved "https://registry.yarnpkg.com/query-string/-/query-string-9.2.2.tgz#a0104824edfdd2c1db2f18af71cef7abf6a3b20f"
|
||||
integrity sha512-pDSIZJ9sFuOp6VnD+5IkakSVf+rICAuuU88Hcsr6AKL0QtxSIfVuKiVP2oahFI7tk3CRSexwV+Ya6MOoTxzg9g==
|
||||
query-string@9.3.0:
|
||||
version "9.3.0"
|
||||
resolved "https://registry.yarnpkg.com/query-string/-/query-string-9.3.0.tgz#f2d60d6b4442cb445f374b5ff749b937b2cccd03"
|
||||
integrity sha512-IQHOQ9aauHAApwAaUYifpEyLHv6fpVGVkMOnwPzcDScLjbLj8tLsILn6unSW79NafOw1llh8oK7Gd0VwmXBFmA==
|
||||
dependencies:
|
||||
decode-uri-component "^0.4.1"
|
||||
filter-obj "^5.1.0"
|
||||
@@ -3185,10 +3190,10 @@ safe-regex-test@^1.1.0:
|
||||
es-errors "^1.3.0"
|
||||
is-regex "^1.2.1"
|
||||
|
||||
sass@1.90.0:
|
||||
version "1.90.0"
|
||||
resolved "https://registry.yarnpkg.com/sass/-/sass-1.90.0.tgz#d6fc2be49c7c086ce86ea0b231a35bf9e33cb84b"
|
||||
integrity sha512-9GUyuksjw70uNpb1MTYWsH9MQHOHY6kwfnkafC24+7aOMZn9+rVMBxRbLvw756mrBFbIsFg6Xw9IkR2Fnn3k+Q==
|
||||
sass@1.92.1:
|
||||
version "1.92.1"
|
||||
resolved "https://registry.yarnpkg.com/sass/-/sass-1.92.1.tgz#07fb1fec5647d7b712685d1090628bf52456fe86"
|
||||
integrity sha512-ffmsdbwqb3XeyR8jJR6KelIXARM9bFQe8A6Q3W4Klmwy5Ckd5gz7jgUNHo4UOqutU5Sk1DtKLbpDP0nLCg1xqQ==
|
||||
dependencies:
|
||||
chokidar "^4.0.0"
|
||||
immutable "^5.0.2"
|
||||
|
||||
@@ -1,4 +1,3 @@
|
||||
version: "4.4.0"
|
||||
version: "4.4.1"
|
||||
edition: "Community"
|
||||
published: "2025-08-15"
|
||||
designation: "beta1"
|
||||
published: "2025-09-16"
|
||||
|
||||
@@ -60,7 +60,7 @@
|
||||
<td>{{ worker.pid|placeholder }}</td>
|
||||
</tr>
|
||||
<tr>
|
||||
<th scope="row">{% trans "Curent Job" %}</th>
|
||||
<th scope="row">{% trans "Current Job" %}</th>
|
||||
<td>{{ job.func_name|placeholder }}</td>
|
||||
</tr>
|
||||
<tr>
|
||||
|
||||
@@ -117,4 +117,9 @@
|
||||
{% render_field form.comments %}
|
||||
</div>
|
||||
|
||||
{# Meta fields #}
|
||||
<div class="bg-primary-subtle border border-primary rounded-1 pt-3 px-3 mb-3">
|
||||
{% render_field form.changelog_message %}
|
||||
</div>
|
||||
|
||||
{% endblock %}
|
||||
|
||||
@@ -87,9 +87,12 @@
|
||||
</div>
|
||||
{% endif %}
|
||||
|
||||
{% if form.comments %}
|
||||
<div class="field-group mb-5">
|
||||
<h2 class="text-center">{% trans "Comments" %}</h2>
|
||||
{% render_field form.comments %}
|
||||
</div>
|
||||
{% endif %}
|
||||
<div class="field-group mb-5">
|
||||
<h2 class="text-center">{% trans "Comments" %}</h2>
|
||||
{% render_field form.comments %}
|
||||
</div>
|
||||
|
||||
{# Meta fields #}
|
||||
<div class="bg-primary-subtle border border-primary rounded-1 pt-3 px-3 mb-3">
|
||||
{% render_field form.changelog_message %}
|
||||
</div>
|
||||
|
||||
@@ -27,6 +27,16 @@
|
||||
alt="{{ object.description|default:object.name }}"
|
||||
/>
|
||||
</a>
|
||||
{% empty %}
|
||||
<a href="{{ object.get_absolute_url }}" class="d-block text-decoration-none" title="{{ object.name }}">
|
||||
<div class="d-flex align-items-center justify-content-center rounded bg-light text-secondary border" style="width: 200px; height: 200px;">
|
||||
<div class="text-center">
|
||||
<i class="mdi mdi-image-broken-variant display-4"></i>
|
||||
<div class="small mt-2 text-dark">{% trans "Thumbnail cannot be generated" %}</div>
|
||||
<div class="small fw-bold text-dark">{% trans "Click to view original" %}</div>
|
||||
</div>
|
||||
</div>
|
||||
</a>
|
||||
{% endthumbnail %}
|
||||
<div class="text-center text-secondary text-truncate fs-5">
|
||||
{{ object }}
|
||||
|
||||
@@ -1,16 +1,14 @@
|
||||
{% load buttons %}
|
||||
{% load i18n %}
|
||||
|
||||
<div class="alert alert-warning" role="alert">
|
||||
<div class="d-flex justify-content-between">
|
||||
<div>
|
||||
<i class="mdi mdi-alert p-2"></i>
|
||||
{% blocktrans trimmed with model=model|meta:"verbose_name" prerequisite_model=prerequisite_model|meta:"verbose_name" %}
|
||||
Before you can add a {{ model }} you must first create a <strong>{{ prerequisite_model }}</strong>.
|
||||
{% endblocktrans %}
|
||||
</div>
|
||||
<div>
|
||||
{% add_button prerequisite_model %}
|
||||
</div>
|
||||
</div>
|
||||
<div class="alert alert-warning d-flex align-items-center" role="alert">
|
||||
<span class="text-warning fs-1">
|
||||
<i class="mdi mdi-alert"></i>
|
||||
</span>
|
||||
<span class="flex-fill">
|
||||
{% blocktrans trimmed with model=model|meta:"verbose_name" prerequisite_model=prerequisite_model|meta:"verbose_name" %}
|
||||
Before you can add a {{ model }} you must first create a <strong>{{ prerequisite_model }}</strong>.
|
||||
{% endblocktrans %}
|
||||
</span>
|
||||
{% add_button prerequisite_model return_url=request.path %}
|
||||
</div>
|
||||
|
||||
@@ -4,12 +4,12 @@
|
||||
<div class="card">
|
||||
<h2 class="card-header">{% trans "Related Objects" %}</h2>
|
||||
<ul class="list-group list-group-flush" role="presentation">
|
||||
{% for qs, filter_param in related_models %}
|
||||
{% action_url qs.model 'list' as list_url %}
|
||||
{% for related_object_count in related_models %}
|
||||
{% action_url related_object_count.queryset.model 'list' as list_url %}
|
||||
{% if list_url %}
|
||||
<a href="{{ list_url }}?{{ filter_param }}={{ object.pk }}" class="list-group-item list-group-item-action d-flex justify-content-between">
|
||||
{{ qs.model|meta:"verbose_name_plural"|bettertitle }}
|
||||
{% with count=qs.count %}
|
||||
<a href="{{ list_url }}?{{ related_object_count.filter_param }}={{ object.pk }}" class="list-group-item list-group-item-action d-flex justify-content-between">
|
||||
{{ related_object_count.name }}
|
||||
{% with count=related_object_count.queryset.count %}
|
||||
{% if count %}
|
||||
<span class="badge text-bg-primary rounded-pill">{{ count }}</span>
|
||||
{% else %}
|
||||
|
||||
@@ -77,4 +77,9 @@
|
||||
<div class="field-group my-5">
|
||||
{% render_field form.comments %}
|
||||
</div>
|
||||
|
||||
{# Meta fields #}
|
||||
<div class="bg-primary-subtle border border-primary rounded-1 pt-3 px-3 mb-3">
|
||||
{% render_field form.changelog_message %}
|
||||
</div>
|
||||
{% endblock %}
|
||||
|
||||
@@ -69,7 +69,7 @@ class ContactGroupBulkEditForm(NetBoxModelBulkEditForm):
|
||||
required=False
|
||||
)
|
||||
description = forms.CharField(
|
||||
label=_('Desciption'),
|
||||
label=_('Description'),
|
||||
max_length=200,
|
||||
required=False
|
||||
)
|
||||
|
||||
Binary file not shown.
File diff suppressed because it is too large
Load Diff
Some files were not shown because too many files have changed in this diff Show More
Reference in New Issue
Block a user