[Bug] Excessive logging #843

Closed
opened 2025-12-29 02:24:43 +01:00 by adam · 6 comments

Originally created by @pittbull on GitHub (Oct 24, 2024).

Is this a support request?

  • This is not a support request

Is there an existing issue for this?

  • I have searched the existing issues

Current Behavior

Headscale is successfully up and running behind Nginx Proxy Manager. The logs in NPM suggest a problem reaching DERP:

[24/Oct/2024:13:54:05 +0200] - 404 404 - GET https <URL> "/derp/latency-check" [Client 178.232.163.94] [Length 0] [Gzip -] [Sent-to unraid.home] "Go-http-client/1.1" "-"
[24/Oct/2024:13:54:05 +0200] - 404 404 - GET https <URL> "/derp/latency-check" [Client 178.232.163.94] [Length 0] [Gzip -] [Sent-to unraid.home] "Go-http-client/1.1" "-"
[24/Oct/2024:13:54:16 +0200] - 404 404 - GET https <URL> "/derp/latency-check" [Client 178.232.163.94] [Length 0] [Gzip -] [Sent-to unraid.home] "Go-http-client/1.1" "-"
[24/Oct/2024:13:54:16 +0200] - 404 404 - GET https <URL> "/derp/latency-check" [Client 178.232.163.94] [Length 0] [Gzip -] [Sent-to unraid.home] "Go-http-client/1.1" "-"
[24/Oct/2024:13:54:16 +0200] - 404 404 - GET https <URL> "/derp/latency-check" [Client 192.168.1.19] [Length 0] [Gzip -] [Sent-to unraid.home] "Go-http-client/1.1" "-"
[24/Oct/2024:13:54:16 +0200] - 404 404 - GET https <URL> "/derp/latency-check" [Client 192.168.1.19] [Length 0] [Gzip -] [Sent-to unraid.home] "Go-http-client/1.1" "-"

This is logged every other second.

Expected Behavior

If this indicates a problem, I don't know how to solve it. If not, could this logging be disabled?

Steps To Reproduce

Start headscale as normal.

Environment

- OS: Unraid
- Headscale version: 0.23.0
- Tailscale version: 1.76.1

Runtime environment

  • Headscale is behind a (reverse) proxy
  • Headscale runs in a container

Anything else?

No response

adam added the bug label 2025-12-29 02:24:43 +01:00
adam closed this issue 2025-12-29 02:24:43 +01:00

@pittbull commented on GitHub (Oct 25, 2024):

I temporarily reduced the logging by adding a custom location in nginx, and disabling access logging.

As I use these log files to analyze access, these events skewed the numbers.

(screenshot: nginx custom location with access logging disabled)

@nblock commented on GitHub (Oct 29, 2024):

According to https://github.com/tailscale/tailscale/commit/15fc6cd96637e8a0e697ff2157c1608ada8e4a39, the routes /derp/probe and /derp/latency-check are the same, and different versions of the Tailscale client use one or the other endpoint.

Headscale currently only supports /derp/probe (see: https://github.com/juanfont/headscale/blob/main/hscontrol/app.go#L462). We should probably handle /derp/latency-check with the same handler.
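
A minimal sketch of that idea, not headscale's actual code: it assumes a plain net/http mux and a hypothetical probeHandler, whereas headscale's real routing lives in hscontrol/app.go and may use a different router and handler names.

package main

import (
	"log"
	"net/http"
)

// probeHandler answers DERP reachability probes with an empty 200 response,
// which is all the client's latency check needs.
func probeHandler(w http.ResponseWriter, r *http.Request) {
	w.WriteHeader(http.StatusOK)
}

func main() {
	mux := http.NewServeMux()
	// Older clients request /derp/probe, newer ones /derp/latency-check;
	// both paths are served by the same handler.
	mux.HandleFunc("/derp/probe", probeHandler)
	mux.HandleFunc("/derp/latency-check", probeHandler)
	log.Fatal(http.ListenAndServe(":8080", mux))
}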


@nblock commented on GitHub (Oct 29, 2024):

Since you are using Nginx Proxy Manager, you might as well add an internal redirect on the exact path /derp/latency-check until a fix is available in headscale. Something like:

location = /derp/latency-check {
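  # "last" stops processing the current rewrite directives and restarts the
  # location search with the rewritten URI, so the request is proxied
  # upstream as /derp/probe instead of returning 404.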
  rewrite ^/derp/latency-check$ /derp/probe last;
}

@pittbull commented on GitHub (Oct 29, 2024):

I had to implement your suggestion like this:
(screenshot: Nginx Proxy Manager custom location configuration)

This results in going from a 101 message to a 200.


@nblock commented on GitHub (Oct 29, 2024):

Right, I was actually aiming to work around the HTTP 404 rather than the logging you mentioned.

Headscale itself does not log requests to /derp/probe or /derp/latency-check, so the excessive logging is only due to the reverse proxy and should be handled there.


@pittbull commented on GitHub (Oct 30, 2024):

I see. Thanks for the feedback.

Reference: starred/headscale#843