[Bug] systemctl stop headscale is very slow! #723

Closed
opened 2025-12-29 02:22:50 +01:00 by adam · 4 comments
Owner

Originally created by @1298391314 on GitHub (Jun 5, 2024).

Is this a support request?

  • This is not a support request

Is there an existing issue for this?

  • I have searched the existing issues

Current Behavior

systemctl stop headscale takes a long time to complete every time!

While debugging, I found that shutdown blocks at [h.pollNetMapStreamWG.Wait()](https://github.com/juanfont/headscale/blob/main/hscontrol/app.go#L816).
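
For illustration, here is a minimal Go sketch of the pattern (hypothetical code, not headscale's actual implementation): if every long-poll map stream registers with a WaitGroup and only returns once its client disconnects, Wait() during shutdown blocks for as long as any node stays connected, unless the streams are told to close or the wait is given a deadline.

```go
package main

import (
	"fmt"
	"sync"
	"time"
)

// Hypothetical sketch: each long-poll map stream registers with a WaitGroup
// and only returns when told to stop. Shutdown then blocks in Wait() for as
// long as any stream stays parked.
func main() {
	var pollNetMapStreamWG sync.WaitGroup
	stop := make(chan struct{}) // would have to be signalled to every stream

	pollNetMapStreamWG.Add(1)
	go func() {
		defer pollNetMapStreamWG.Done()
		<-stop // a connected node's stream, parked indefinitely
	}()

	fmt.Println("stopping, waiting for poll streams...")

	// Run Wait() in a goroutine so the caller can put a deadline on it.
	done := make(chan struct{})
	go func() {
		pollNetMapStreamWG.Wait()
		close(done)
	}()

	select {
	case <-done:
		fmt.Println("all streams closed")
	case <-time.After(2 * time.Second):
		// Without closing `stop`, Wait() would never return: this is the
		// hang observed during `systemctl stop headscale`.
		fmt.Println("timed out; telling streams to close")
		close(stop)
		<-done
	}
}
```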

Expected Behavior

systemctl stop headscale should stop the service promptly.

Steps To Reproduce

1. Make a node online
2. Run systemctl stop headscale

Environment

- OS: Ubuntu 20.04
- Headscale version: headscale_0.22.3_linux_amd64
- Tailscale version: 1.66

Runtime environment

  • Headscale is not behind a (reverse) proxy
  • Headscale does not run in a container

Anything else?

No response

adam added the bug label 2025-12-29 02:22:50 +01:00
adam closed this issue 2025-12-29 02:22:50 +01:00
Author
Owner

@kradalby commented on GitHub (Jun 5, 2024):

Please try the new alpha; 0.22.3 won't receive any further fixes.

Author
Owner

@1298391314 commented on GitHub (Jun 6, 2024):

> Please try the new alpha; 0.22.3 won't receive any further fixes.

I compiled headscale myself using the latest code, and the problem still exists.

Author
Owner

@bottiger1 commented on GitHub (Jun 21, 2024):

I am also getting this problem with [v0.23.0-alpha12](https://github.com/juanfont/headscale/releases/tag/v0.23.0-alpha12).

Author
Owner

@nblock commented on GitHub (Sep 7, 2024):

The problem also exists on 0.23.0-beta3 and on commit 8a3a0fee3ccbca7dd67b0d2965b523c8b6cb5451.

I could also reproduce this without systemd involved:

  • Start headscale
  • Connect a node
  • Hit ctrl-c
    INF github.com/juanfont/headscale/hscontrol/poll.go:698 > node has connected, mapSession: 0xc000202f00, chan: 0xc000496460 node=localhost node.id=1 omitPeers=false readOnly=false stream=true
    ^C
    INF Received signal to stop, shutting down gracefully signal=interrupt

  • Hangs until killed with: kill -9 $(pidof headscale)
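
This matches the WaitGroup blocking described in the issue body. As a hedged sketch (not headscale's implementation), a bounded graceful shutdown would keep a parked long-poll connection from holding the process open after a signal:

```go
package main

import (
	"context"
	"log"
	"net/http"
	"os/signal"
	"syscall"
	"time"
)

func main() {
	// Turn SIGINT/SIGTERM into context cancellation.
	ctx, stop := signal.NotifyContext(context.Background(), syscall.SIGINT, syscall.SIGTERM)
	defer stop()

	srv := &http.Server{Addr: ":8080"} // placeholder server, not headscale's
	go func() {
		if err := srv.ListenAndServe(); err != nil && err != http.ErrServerClosed {
			log.Fatal(err)
		}
	}()

	<-ctx.Done()
	log.Println("received signal, shutting down")

	// Give in-flight requests a bounded window; if long-poll streams are
	// still open when the deadline hits, force-close the connections.
	shutdownCtx, cancel := context.WithTimeout(context.Background(), 10*time.Second)
	defer cancel()
	if err := srv.Shutdown(shutdownCtx); err != nil {
		log.Println("graceful shutdown timed out, forcing close:", err)
		srv.Close()
	}
}
```

When stopping via systemd, a TimeoutStopSec= setting in the unit file would at least bound how long systemctl stop waits before systemd escalates to SIGKILL.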