DNT headers are not by themselves legally binding. However, they can rightfully be considered an indication of a user's preferences (kind of like how the OS language settings can be an indication of what language a user wants a website to use).
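For context on the mechanics (separate from the legal question): DNT is just another request header, read the same way a site might read Accept-Language as a language hint. Below is a minimal sketch assuming a Node.js HTTP server; the function names and port are illustrative, not anything mandated by a spec.

```typescript
// Sketch: treating DNT as a preference signal, the same way Accept-Language
// is treated as a language hint. Assumes Node.js; names/port are illustrative.
import { createServer, IncomingMessage, ServerResponse } from "http";

function userPrefersNoTracking(req: IncomingMessage): boolean {
  // "DNT: 1" means the user has asked not to be tracked.
  // It's a preference signal, not a legally binding instruction by itself.
  return req.headers["dnt"] === "1";
}

function preferredLanguage(req: IncomingMessage): string {
  // Accept-Language is the same kind of hint, e.g. "de-DE,de;q=0.9,en;q=0.8".
  const header = String(req.headers["accept-language"] ?? "en");
  return header.split(",")[0].trim();
}

createServer((req: IncomingMessage, res: ServerResponse) => {
  const msg = userPrefersNoTracking(req)
    ? "DNT: 1 received - honoring it as a no-tracking preference"
    : "No DNT signal - no preference expressed either way";
  res.writeHead(200, { "Content-Type": "text/plain" });
  res.end(`${msg} (language hint: ${preferredLanguage(req)})\n`);
}).listen(8080);
```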
What most people miss about the GDPR is that most of it (as well as the ePrivacy Directive, which covers more technical aspects like cookies) really only exists because of the one big thing at its core that most people are either not aware of or intentionally omitting:
The GDPR establishes a user's right to ownership and control of their personally identifiable information as an inalienable and irrevocable fundamental human right. This is what makes all the rest of it necessary: it's not about "cookie banners", it's about requiring others to obtain consent for what they want to do with that information; it's not about writing "privacy policies", it's about explaining what you do with that information, how you guarantee that you respect those rights, who you're passing it on to, and how you're ensuring they respect those rights too.
The alternative to consent dialogs (whether as "pop-ups" or via confirmations when prompting for relevant information) would be requiring every website to have a written contract with each user. Consent is only valid if it is demonstrably informed (and non-coerced, but that's a different story), and it must be specific and revocable. You can't have users blanket opt in to everything you'd like - they wouldn't even know what consent they'd need to withdraw later if they reconsidered.
By the way, courts recently seem to have started ruling that, given the way many AIs work, the companies training them are violating copyright law by using intellectual property as training data without permission. And for a contract to be legally binding, each party has to give consideration (i.e. anything of value given by one party has to be balanced out by something of value given by the other party) - so I wouldn't be too quick to ridicule the idea that using Facebook means Facebook can do with your data whatever its terms of service say it can do, even if posting on Facebook can probably not be considered an effective way of informing Meta of your disagreement.
And they could have been made legally binding. The EU established the precedent of requiring consent for tracking only after the ship had sailed on this technology.
The only thing required to make a signal like that legally binding is the force of law. It just wasn't there for DNT.