Permission denied by ACL for some clients when the same tag is used on multiple clients #673

Closed
opened 2025-12-29 02:21:53 +01:00 by adam · 15 comments

Originally created by @masterwishx on GitHub (Mar 19, 2024).

Running in Docker behind Nginx Proxy Manager on an Oracle VPS with Ubuntu,
using the latest version, [v0.23.0-alpha5](https://github.com/juanfont/headscale/releases/tag/v0.23.0-alpha5).

Web UI: https://github.com/goodieshq/headscale-admin

When using the same tag on a couple of clients, some of the clients do not behave as defined in the ACLs. I tried some other tags, but the behavior is the same.

Tags were assigned to the clients as follows:

  1. vps - tag:cloud-server
  2. vps - tag:cloud-server
  3. vps - tag:cloud-server
  4. unraid server - tag:home-server
  5. win11 - tag:home-pc
  6. vm - tag:home-server

Tailscale SSH is enabled on the 3 VPS, but VPS 3 can't SSH to VPS 2, while it can reach VPS 1; the other VPS work fine.
Also, from Unraid I can't SSH to any of the VPS; only after changing the tag on the VM to `test`, so that Unraid has a unique tag, does it start working.
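For reference, this is roughly how I test it from each VPS (a sketch; the hostname `vps-2` is a placeholder, not the real node name):

```
# on vps 3: confirm vps 2 shows up as a peer
tailscale status
# try Tailscale SSH to vps 2 as one of the users allowed by the SSH rule
tailscale ssh ubuntu@vps-2
```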

Before this I used other tags, like cloud, server, etc.
The tags were added via headscale-admin.

My ACL for now:

```
{
  "groups": {
    "group:admin": ["user1"]
  },

  "tagOwners": {
    "tag:cloud-server:*": ["group:admin"],
    "tag:home-pc:*": ["group:admin"],
    "tag:home-server:*": ["group:admin"],
    "tag:home-vm:*": ["group:admin"],
    "tag:mobile:*": ["group:admin"]
  },

  "acls": [
    {
      "action": "accept",
      "src": ["group:admin"],
      "dst": ["*:*"]
    }
  ],

  "ssh": [
    {
      "action": "accept",
      "src": ["tag:cloud-server", "tag:home-server", "tag:home-pc"],
      "dst": ["tag:cloud-server"],
      "users": ["root", "ubuntu"]
    }
  ]
}
```

adam added the bug, no-stale-bot, policy 📝, tags labels 2025-12-29 02:21:53 +01:00
adam closed this issue 2025-12-29 02:21:53 +01:00

@masterwishx commented on GitHub (Mar 20, 2024):

From debug on client 3 (no client 3 in the list):

![image](https://github.com/juanfont/headscale/assets/28630321/5f174a2f-da97-4395-88c5-f60e97a6103d)

![image](https://github.com/juanfont/headscale/assets/28630321/6c87300d-ef23-4831-9941-ce9c7fbd0f8a)


@github-actions[bot] commented on GitHub (Aug 7, 2024):

This issue is stale because it has been open for 90 days with no activity.


@almereyda commented on GitHub (Aug 7, 2024):

This was reproduced here.


@almereyda commented on GitHub (Sep 29, 2024):

Related to #1369


@kradalby commented on GitHub (May 5, 2025):

A beta with the new policy has been released; I think it should have improved the situation, and I would love to hear whether this is still happening: https://github.com/juanfont/headscale/releases/tag/v0.26.0-beta.1


@masterwishx commented on GitHub (May 5, 2025):

I removed the old node as it was unused; maybe I can reproduce it, but I'm not sure.


@almereyda commented on GitHub (Sep 23, 2025):

This behaviour disappeared here after reloading the Headscale service.

- #2375
- #2389

Suggesting to close, if this can be confirmed by the OP.

@masterwishx Would you like to take a moment to check if restarting your Headscale service makes the observed behaviour disappear for you, too?


@masterwishx commented on GitHub (Sep 23, 2025):

> Would you like to take a moment to check if restarting your Headscale service makes the observed behaviour disappear for you, too?

I moved back from 0.26 to 0.25 after all the issues. Also, I only have one cloud server now because of Oracle problems.

I can't remember whether it worked after updating from 0.23; I will try to check with virtual machines.


@almereyda commented on GitHub (Sep 23, 2025):

Glad to hear it might be possible to roll back one version without breaking the database. Let me try with #2785 at some point and report back.


@masterwishx commented on GitHub (Sep 23, 2025):

> Glad to hear it might be possible to roll back one version without breaking the database

No, I have to use backups to roll back.


@kradalby commented on GitHub (Dec 12, 2025):

Changes to separate tags from users have been merged into `main` in #2885 and #2931. I encourage you to help test this if you are able to build `main` and run it.

I will close this to track progress, but there might still be bugs and the like related to this change. As part of hardening this feature, we are tracking all related tag bugs over time in the [v0.28.0 milestone](https://github.com/juanfont/headscale/milestone/13).


@masterwishx commented on GitHub (Dec 19, 2025):

> Changes to separate tags from users have been merged into `main` in #2885 and #2931. I encourage you to help test this if you are able to build `main` and run it.
>
> I will close this to track progress, but there might still be bugs and the like related to this change. As part of hardening this feature, we are tracking all related tag bugs over time in the [v0.28.0 milestone](https://github.com/juanfont/headscale/milestone/13).

Updated to 0.26.1, 0.27.1 and 0.28.0-beta.1: I can't reach the VPS node from my main PC anymore, and SSH is also not working. :(
All my old nodes are no longer shown in the list (because they run old Tailscale versions, as described in the docs).
Something seems broken for the 3 nodes I'm actually using right now:

```
100.64.0.15  instance-mysite-cloud  userid:2147455555  linux    idle; offers exit node
100.64.0.7   myserver               userid:2147455555  linux    active; relay "par", tx 7488 rx 0
100.64.0.2   desktop-mypc           userid:2147455555  windows  -
```
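To narrow this down, I can compare what headscale and the clients report after the upgrade (a rough sketch; the container name `headscale` is an assumption from my Docker setup):

```
# list the nodes headscale itself knows about
docker exec headscale headscale nodes list
# from the main pc: check which peers are visible and reachable
tailscale status
tailscale ping instance-mysite-cloud
```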


@almereyda commented on GitHub (Dec 20, 2025):

Can you roll back to a snapshot or backup from before the complete migration? Did the error already appear after one of the intermediate upgrades, or only at the end? Could you please open a separate issue with your questions and findings? Thanks.


@masterwishx commented on GitHub (Dec 20, 2025):

> Can you roll back to a snapshot or backup from before the complete migration? Did the error already appear after one of the intermediate upgrades, or only at the end? Could you please open a separate issue with your questions and findings? Thanks.

Thanks, sure I can.
For now I have restored back to 0.25.1.

I will try again soon; it may also be related to headscale-admin 0.26 and/or headplane (which needed to be migrated to the latest version).

Do you mean I should check whether each updated version works fine?


@almereyda commented on GitHub (Dec 21, 2025):

Let's continue in a separate issue. Please link it here or feel free to ping me there.

Reference: starred/headscale#673