Timeline for Why no client-side HTML include tag?
Current License: CC BY-SA 2.5
13 events
| when | what | by | license | comment |
|---|---|---|---|---|
| Feb 13, 2016 at 3:29 | comment added | supercat | | @bobince: As I would want it implemented, an attempt to include any resource from a server which didn't respond to a request "Page XX wants to include resource YY" with "Here's resource YY, which I want to make available to page XX" would be blocked by the browser. If the server explicitly wants to make something available for client-side includes, I don't see how that should cause problems with client-side request forging if servers don't accept include requests for content which shouldn't be freely available. |
| Sep 9, 2014 at 7:47 | comment added | bobince | | Nested resources aren't a problem. Client side request forgery is the problem. Putting includes in the parent DOM is worse than an <img> because you can't read the content of an <img>. (You can find out whether the target resource is a valid image and if so how big it is, and there was a worse attack with stylesheet inclusion which the browsers have largely worked around; these are bad things, which partially break the Same Origin Policy and wouldn't have been allowed by default if we'd had CORS at the time. But nothing like as bad as reading the entire content of the target via DOM.) |
| Sep 8, 2014 at 16:57 | comment added | supercat | | @bobince: If an HTML-include request could forbid the included HTML from including any nested resources which weren't white-listed by the include request itself, and applied a prefix to the ID attribute of all included items, how would it be any worse than <img> or (dare I say it) JavaScript? |
| Sep 8, 2014 at 16:24 | comment added | bobince | | @supercat: there are other forms of authentication, such as connection-level (HTTPS client certs and IWA) and location-based (eg pages on an intranet which you can access from your browser in a secure network - don't want that to leak to the outside). Adding ‘unauthenticated' HTTP connections to browsers is tricky—just look at the problems Java applets and Flash caused by doing something similar with their own network stacks. |
| Sep 8, 2014 at 16:13 | comment added | supercat | | ...access to cookies and authentication tokens (even if a site used cookies to select among different "skins", one could simply have the generated HTML for the site vary the path to the included file based upon the chosen skin). The first page loaded after changing skins would have to fetch a new include file for that skin, but later pages could use the new cached one; the earlier skin would remain in the cache, and would appear if any pages that haven't been revisited since the skin change are viewed in "off-line" mode. |
| Sep 8, 2014 at 16:07 | comment added | supercat | | @bobince: Is there any reason that the request on the client side would have to include cookies and HTTP authentication tokens? The primary usage scenario I would see for client-side includes would be to improve caching of static page content. If sixteen pages all include the same header and footer, using a client-side include would increase the time required to load the first, but reduce the time to load the remaining fifteen. The usage cases where the include would be the most helpful would be precisely those where the data to be "included" would be static and thus not need... |
| Oct 1, 2010 at 12:44 | comment added | bobince | | You can try to fetch a bank page on the server side but your request will be unauthenticated so you can't download any juicy information. A request from the client side includes cookies and HTTP authentication tokens; if you can read the response from such a request you can fully impersonate the user. |
| Sep 30, 2010 at 22:37 | comment added | serg | | I don't get how it is a security disaster. I can read a bank page on the server side right now and spit it out on another page - is it a disaster? Maybe, but certainly not a security-related one. A security disaster would be reading cookies from a different domain. Without that, a client-side include would be exactly the same as a server-side one. I don't see any problem here. |
| Sep 27, 2010 at 22:05 | history edited | AArteDoCodigo.com.br - Maniero | CC BY-SA 2.5 | Deleted the word "lazy", which could be offensive. |
| Sep 25, 2010 at 15:23 | comment added | Jé Queue | | It would do significantly more than the server. I'm not sure why it would need to block page load; it could have allowed for full page load with async content-fill. Of course it could be limited by browsers to only allow pulls from originating servers or to allow a domained DOM. |
| Sep 25, 2010 at 10:22 | comment added | Alan Pearce | | Actually, the client (or a proxy) could cache more efficiently, as templates (or header/footer includes) don't tend to change from page to page, meaning the user might at least be able to see part of the page whilst some server-side processing is going on. |
| Sep 25, 2010 at 10:07 | history edited | Wizard79 | CC BY-SA 2.5 | added 16 characters in body |
| Sep 25, 2010 at 8:01 | history answered | bobince | CC BY-SA 2.5 | |
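
The opt-in mechanism supercat describes in the Feb 13, 2016 comment, and the caching argument from the Sep 8, 2014 and Sep 25, 2010 comments, can be approximated today with a credential-less fetch gated by CORS. The sketch below is only an illustration of that idea, not anything the commenters or the answer actually specify: the `data-include` attribute and the `includeFragments` helper are made-up names, and it assumes the included fragments are trusted, author-controlled HTML.

```typescript
// Illustrative sketch (assumptions as noted above): emulate a client-side
// include with fetch. credentials: "omit" keeps cookies and HTTP auth out of
// the request, so it carries no ambient authority (bobince's request-forgery
// concern), and mode: "cors" means the page can only read the response if the
// target server explicitly opts in with Access-Control-Allow-Origin
// (supercat's "Here's resource YY, which I want to make available" handshake).
async function includeFragments(root: Document = document): Promise<void> {
  const hosts = root.querySelectorAll<HTMLElement>("[data-include]");
  for (const host of Array.from(hosts)) {
    const url = host.dataset.include;
    if (!url) continue;
    try {
      const res = await fetch(url, { credentials: "omit", mode: "cors" });
      if (!res.ok) continue;
      // Assumes the fragment is trusted, static markup (header/footer);
      // untrusted content would need sanitizing before insertion.
      host.innerHTML = await res.text();
    } catch {
      // Servers that do not opt in via CORS are simply skipped.
    }
  }
}

// Usage: <div data-include="/fragments/header.html"></div>
document.addEventListener("DOMContentLoaded", () => {
  void includeFragments();
});
```

Because the fragment URL is static and the request is unauthenticated, the browser's HTTP cache can serve a shared header or footer for every page after the first load, which is the caching benefit supercat and Alan Pearce describe.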