
So is this basically a safe version of innerHTML?


Yes, although a slightly more relevant way of putting it would be that it's a built-in DOMPurify (DOMPurify being an npm package commonly used to sanitize HTML before injecting it).
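Roughly, in a browser, `el.innerHTML = untrusted` inserts the markup verbatim while `el.setHTML(untrusted)` runs it through the sanitizer first. Here is a runnable toy of the *kind* of thing a sanitizer strips (`toyStripScripts` is hypothetical and deliberately naive; real sanitizers like setHTML or DOMPurify operate on the parsed DOM tree, never on raw strings):

```javascript
// Hypothetical toy: drops <script> elements and inline on* event handlers.
// Illustration only; do not use regexes like this for real sanitization.
function toyStripScripts(html) {
  return html
    .replace(/<script\b[\s\S]*?<\/script>/gi, '')              // <script>...</script>
    .replace(/\son\w+\s*=\s*("[^"]*"|'[^']*'|[^\s>]+)/gi, ''); // onerror=, onclick=, ...
}

console.log(toyStripScripts('<b>hi</b><script>steal()</script>'));
// -> <b>hi</b>
console.log(toyStripScripts('<img src=x onerror=alert(1)>'));
// -> <img src=x>
```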


Is this basically doing the same thing as https now? But for http, and Firefox just never implemented a simple fix for its entire existence until now?

I obviously know nothing about this, but I still find it fascinating. Or am I off my block.


XSS isn't related to HTTPS/SSL. SSL is the secure connection between you and the server, while XSS is the injection of data into the site that will then be executed in your browser. The connection isn't relevant.

https://developer.mozilla.org/en-US/docs/Web/Security/Attack...


This has nothing whatsoever to do with http.


I'm confused as to why you need a "safe" version if you're the one generating and injecting the HTML.


As it turns out, verifying that HTML is safe to render without neutering HTML down to a whitelist of elements is actually quite difficult. That's not great when you're rendering user-generated content.

Solutions in the form of pre-existing HTML sanitisation libraries have existed for years but countless websites still manage to get XSS'd every year because not everyone capable of writing code is capable of writing secure code.


Isn't this kinda like asking "why does my gun need a safety if I'm the only one consciously pulling the trigger"?


Because you sometimes generate HTML based on user input, and a huge share of web-related security vulnerabilities have been exactly this.


It was kind of strange to have BBCode and wiki markup specifically to avoid allowing users to use HTML.


Gruber's original Markdown tool passes HTML straight through; it was designed to make writing long-form content easier.

Markdown implementations can do any of that: pass HTML through, allow only a whitelist of HTML elements (as GFM does), or disallow HTML entirely.


1. Because you commonly are not.

2. Because it’s really easy to fuck up and leak attacker controlled content in markup, especially when the environment provides tons of tools to do things wrong and none to do things right. IME even when the environment provides tons of tools to do things right it’s an uphill battle (universe, idiots, yadda yadda).


If you generate it from completely static and known values, have at it.

If you include user-provided data, then you should sanitize it for HTML.
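A sketch of that second case (the `escapeHTML` helper here is hypothetical; in practice you'd reach for `textContent`, `setHTML`, or a vetted library rather than hand-rolling it):

```javascript
// Escape the five HTML-significant characters so user data renders as text.
function escapeHTML(s) {
  return s.replace(/[&<>"']/g, c => (
    { '&': '&amp;', '<': '&lt;', '>': '&gt;', '"': '&quot;', "'": '&#39;' }[c]
  ));
}

const userName = '<script>steal(document.cookie)</script>'; // user-provided
const item = `<li>${escapeHTML(userName)}</li>`; // the script is now inert text
```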


Why should a web page only have a single person generating and injecting HTML into it?


The analogy doesn't hold markup ;)

Whether I generate a whole page or generate a partial page and then add HTML to it is equivalent from a safety perspective.


A single company. Why would I let another company inject HTML into my page?


There's this newfangled concept called social media where you let other people post content that exists on your web site. You're rarely allowed to post HTML because of the associated issues with sanitizing it. setHTML could help with that.


I just had a flashback to the heyday of MySpace. Now that I think about it though, Neocities has the "social networking" of being able to discover other people's pages and give each other likes and comments.

Hmmm...


Or CMS content, or really anything that comes from the user outside of social media content and could cause a reflected XSS.

For example, a search query, a redirect URL, or a million other things.
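A sketch of the search-query case (server and names hypothetical): reflecting the query string verbatim hands the attacker a script slot, while escaping it does not:

```javascript
// Minimal HTML escaper (illustration only; use a real templating library
// or sanitizer in production).
function escapeHTML(s) {
  return s.replace(/[&<>"']/g, c => (
    { '&': '&amp;', '<': '&lt;', '>': '&gt;', '"': '&quot;', "'": '&#39;' }[c]
  ));
}

// Attacker lures the victim to /search?q=<img src=x onerror=steal()>
const q = '<img src=x onerror=steal()>';

const vulnerable = `<h1>Results for ${q}</h1>`;        // reflected XSS
const fixed = `<h1>Results for ${escapeHTML(q)}</h1>`; // renders as plain text
```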


It is to render untrusted (user-generated) HTML without letting its authors slip in markup, like script tags, that could harm other users.


Markdown parsers output unsafe HTML.
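A toy illustration (`toyMarkdown` is hypothetical, not any real parser): like Gruber's original design, most converters pass inline HTML through untouched, so the output is only as safe as the input:

```javascript
// Toy Markdown renderer: converts *emphasis* but passes raw HTML through,
// mirroring the passthrough behavior of the original Markdown design.
function toyMarkdown(src) {
  return '<p>' + src.replace(/\*([^*]+)\*/g, '<em>$1</em>') + '</p>';
}

const out = toyMarkdown('*hi* <script>steal()</script>');
// out still contains the raw <script> tag, so it needs sanitizing
// (e.g. setHTML) before being injected into a page.
```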



