  1. #26
    AutisticCuckoo (SitePoint Author)
    Quote Originally Posted by arkinstall View Post
    I'd agree that serving XHTML as XHTML is impractical, because of the scum-of-the-browser-world called IE.
    But if you're not serving it as X(HT)ML, then you're not using XHTML at all. Any arguable benefits go right out the window; you're using invalid HTML, nothing more, nothing less.

    The only 'benefit' left is that a validator – should you remember to use one – will point out some completely unnecessary tags that you may have missed.

    Quote Originally Posted by arkinstall View Post
    I still can't believe that the web development community is just giving up with it because of IE; due to that attitude XHTML will never reach potential.
    Anything with draconian error handling is unlikely to have any potential for end users. Most authors are not programmers, after all. They think HTML Transitional is too strict, for Pete's sake!

    Quote Originally Posted by arkinstall View Post
    As for uploading XHTML files through FTP... seems a bit caveman to me really.
    One word: Dreamweaver. Lots of people use it and the default setting (IIRC) is to upload on save.

    Quote Originally Posted by arkinstall View Post
    If you don't check things BEFORE you put them on a live site (I use a development subdomain) then you really need your head checking.
    That may work if you publish a page every other fortnight or so, but not if you have a busy news site where dozens of people publish a hundred articles a day. They'll use a CMS, of course, but that CMS had better guarantee well-formed markup if it's using XHTML...
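
    A rough sketch of the kind of well-formedness gate such a CMS would need, using PHP's DOM extension (the function name and the wrapper element below are illustrative assumptions, not code from any particular CMS):

        <?php
        // Reject a submitted XHTML fragment unless it parses as well-formed XML.
        function is_well_formed_xhtml($markup)
        {
            libxml_use_internal_errors(true);   // collect parse errors quietly
            $doc = new DOMDocument();
            // Wrap the fragment so it parses as a single XML document.
            // (Note: HTML-only entities such as &nbsp; would also be rejected here.)
            $ok = $doc->loadXML('<div>' . $markup . '</div>');
            libxml_clear_errors();
            return $ok;
        }

        var_dump(is_well_formed_xhtml('<p>Fine.</p>'));       // bool(true)
        var_dump(is_well_formed_xhtml('<p>Broken<em></p>'));  // bool(false): mismatched tags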

    Quote Originally Posted by arkinstall View Post
    I really don't think I need to tell you this as it's common sense, but judging by your comments you're forgetting it.
    I'm not forgetting it, and it is common sense for a programmer. But it's not necessarily so for a copywriter. Spellchecking? Yes, but that's about it.

    Quote Originally Posted by arkinstall View Post
    Sure, XHTML needs validation.
    No more than HTML needs validation. XHTML needs to be well-formed, though. Browsers will have to cope with invalid XHTML as long as it's well-formed. For instance, <span><h2>I'm Stupid</h2></span>.
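
    To spell out the distinction with a couple of illustrative snippets (my own examples, not anyone's real markup):

        <!-- Well-formed but invalid: every element is properly nested and
             closed, so an XML parser accepts it, but a validator objects
             to a block-level h2 inside an inline span. -->
        <span><h2>I'm Stupid</h2></span>

        <!-- Not well-formed: the em element is never closed, which is a
             fatal parse error when served as application/xhtml+xml. -->
        <p>Some <em>emphasised text</p>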

    Quote Originally Posted by arkinstall View Post
    I see that as a plus, and I always will. Would you write a server-side application and upload it to a live site without checking it? Would you write a C++ application and compile it, then distribute it without checking for errors (hypothetically speaking)?

    If you say no, and I do hope you would, why on earth would you forget that when marking up content?
    No, I wouldn't deploy an application without checking it first. Why? Because debugging, fixing the error, recompiling, testing and redeploying would take a long time, during which the app would be inaccessible to my users. In a professional environment, it would also normally have to go through several testing stages, and the programmer wouldn't be allowed to deploy his/her own app to the acceptance test server or the production server.

    If there's a minor error in document markup, fixing it and re-uploading it is a matter of seconds. That's why content writers may not apply the same rigorous deployment procedures as programmers.

    Quote Originally Posted by arkinstall View Post
    XHTML simply removes the bubble wrap from the katana blade that is markup.
    That's the wrong analogy. XHTML simply adds a few bells and pink ribbons to the katana blade, in the form of sprinkled slashes and unnecessary end tags.
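
    Concretely, the 'bells and pink ribbons' amount to things like this (illustrative snippets):

        <!-- HTML 4.01: optional end tags may be omitted and boolean
             attributes may be minimised. -->
        <ul>
          <li>First item
          <li>Second item
        </ul>
        <input type="checkbox" checked>

        <!-- The same thing in XHTML 1.0: every element closed, every
             attribute given a value, empty elements marked with a slash. -->
        <ul>
          <li>First item</li>
          <li>Second item</li>
        </ul>
        <input type="checkbox" checked="checked" />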
    Birnam wood is come to Dunsinane

  2. #27
    felgall (Programming Since 1978)
    Quote Originally Posted by AutisticCuckoo View Post
    The only 'benefit' left is that a validator – should you remember to use one – will point out some completely unnecessary tags that you may have missed.
    I disagree that those tags are completely unnecessary. Having them there can save a lot of time in future maintenance, because with them the page source is much easier to read. They are tags that a browser may have to treat as optional because of all the poorly written pages out there, but which ought to be included in any properly written page.

    One of the problems with the W3C standards is that they are standards for the browser writers, and therefore don't include the other 70% or so of rules that web page authors ought to be following.

    I don't think anyone has suggested that non-programmers should be expected to follow such strict rules. Then again, the XHTML rules are relatively lenient compared with what a few small typos can do in real programming code, so programmers are used to being far more precise in their code than other people.

    Validating against XHTML instead of HTML, for a page that will be served as HTML, is worth it just for identifying whether you have left out or misplaced any of those extremely useful tags that make the page so much easier to maintain, and that you mistakenly suggested are unnecessary.
    Stephen J Chapman

    javascriptexample.net, Book Reviews, follow me on Twitter
    HTML Help, CSS Help, JavaScript Help, PHP/mySQL Help, blog
    <input name="html5" type="text" required pattern="^$">

  3. #28
    AutisticCuckoo (SitePoint Author)
    Quote Originally Posted by felgall View Post
    I disagree that those tags are completely unnecessary.
    They are technically unnecessary, since their presence can be implied – unambiguously. Thus a user agent doesn't need them. I agree that they are useful for human readers, though. And that's why I do use them, even though they're unnecessary.

    Omitting tags was considered useful back when SGML and HTML first came about. Bandwidth was precious, as was storage space and memory. Why write any more than you have to?

    These days those concerns are mainly moot, except perhaps for some mobile devices. The only reason to omit tags today is probably laziness.

    Quote Originally Posted by felgall View Post
    Validating against XHTML instead of HTML for a page to be served as HTML is worth it just for it identifying if you have left out or misplaced any of those extremely useful tags that make the page so much easier to maintain that you mistakenly suggested are unnecessary.
    I've said it before, but I'll say it again: validators should only look for syntax errors; if you want something to enforce what current fashion decides is 'best practice' you should use something like lint(1).

    Omitting a </p> tag in HTML is not an error, because it is 100% clear where the paragraph ends, even without the end tag. Using an explicit end tag helps human readers, though, which is why it might be a good idea to write them out, but they are not strictly necessary for parsing the document. And HTML documents are mainly intended to be read by machines, not people.
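
    For example (illustrative markup):

        <!-- Valid HTML 4.01: a p element cannot contain another p, so the
             parser knows the first paragraph ends where the second begins. -->
        <p>First paragraph
        <p>Second paragraph

        <!-- The same document with the optional end tags written out; the
             resulting parse tree is identical, it is just easier to scan. -->
        <p>First paragraph</p>
        <p>Second paragraph</p>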
    Birnam wood is come to Dunsinane

  4. #29
    felgall (Programming Since 1978)
    Quote Originally Posted by AutisticCuckoo View Post
    And HTML documents are mainly intended to be read by machines, not people.
    Except when you are the author of the page, working in an editor where you deal with the tags directly, and you want the code to be as easy to read, and hence as easy to maintain, as possible.

    Of course, it would be possible to set up programs to do the conversion back and forth between browser-optimised HTML, with all the optional tags stripped out, and author-optimised XHTML. Then we'd have the best of both worlds. I wonder why such a useful program has never been produced (or, if it has, why it hasn't been properly publicised). Even just a lint program that reads XHTML and spits out web-optimised HTML would be useful.
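
    As a rough sketch of how such a round trip could look, PHP's DOM extension can already do something along these lines (illustrative only, not the finished tool felgall has in mind):

        <?php
        // Parse author-friendly or browser-optimised tag soup...
        libxml_use_internal_errors(true);
        $doc = new DOMDocument();
        $doc->loadHTML('<ul><li>One<li>Two</ul>');

        // ...and re-serialise it as well-formed XML markup with every tag closed.
        echo $doc->saveXML($doc->documentElement);
        // <html><body><ul><li>One</li><li>Two</li></ul></body></html>

        // Going the other way, saveHTML() re-serialises the same tree
        // using HTML rules instead.
        echo $doc->saveHTML();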

    Equally useful would be porting all the useful stuff from XHTML 1.0 back into HTML and calling it HTML 4.1. All it would need is a new doctype to identify what follows as real HTML rather than HTML with errors; nothing in the actual code, or in validating it, would need to change from what currently exists for XHTML 1.0, except that it would no longer be necessary to use pretend HTML in order to do the logical thing and close all tags.
    Stephen J Chapman

    javascriptexample.net, Book Reviews, follow me on Twitter
    HTML Help, CSS Help, JavaScript Help, PHP/mySQL Help, blog
    <input name="html5" type="text" required pattern="^$">

  5. #30
    AlexDawson (Follow: @AlexDawsonUK)
    I have stated in many threads my preference for XHTML, but unlike many, I serve my website correctly: XHTML 1.1 to browsers that support it, and HTML 4.01 to IE and the like. I happen to like the draconian error handling, as I come from a programming background and the thought of syntax errors or invalid code makes my skin crawl.

    Either way, the way I choose to look at it is this: browsers that do support XHTML get the full functionality, while browsers like IE that cannot handle it get a small but visible message at the top of the window (a little yellow highlighted popup) saying "This website could perform better, but Internet Explorer currently does not have the ability to take advantage of this website effectively; to see this site in all its glory, perhaps try another browser!" PHP deals with the checks and triggers to make sure those that support it get what they can.

    It is simple to implement (see the sketch below), and best of all it means I'm not inhibited in what code I write, because I know my website will outright work on a variety of platforms (progressive enhancement).
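
    A minimal sketch of that kind of PHP check (illustrative only; the variable names and the simple substring test are assumptions on my part, not AlexDawson's actual code):

        <?php
        // Serve XHTML 1.1 to browsers whose Accept header announces support
        // for application/xhtml+xml, and HTML 4.01 (plus a notice) otherwise.
        $accept = isset($_SERVER['HTTP_ACCEPT']) ? $_SERVER['HTTP_ACCEPT'] : '';

        if (strpos($accept, 'application/xhtml+xml') !== false) {
            header('Content-Type: application/xhtml+xml; charset=utf-8');
            $doctype = '<!DOCTYPE html PUBLIC "-//W3C//DTD XHTML 1.1//EN" '
                     . '"http://www.w3.org/TR/xhtml11/DTD/xhtml11.dtd">';
            $notice  = '';
        } else {
            header('Content-Type: text/html; charset=utf-8');
            $doctype = '<!DOCTYPE HTML PUBLIC "-//W3C//DTD HTML 4.01//EN" '
                     . '"http://www.w3.org/TR/html4/strict.dtd">';
            $notice  = '<p class="browser-notice">This website could perform better '
                     . 'in a browser that understands XHTML.</p>';
        }

    A more careful version would compare q-values in the Accept header rather than just look for the substring, although, as AutisticCuckoo points out further down the thread, even that sends XHTML to Safari and Chrome, since they claim to prefer it.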

  6. #31
    Jake Arkinstall (Theoretical Physics Student)
    Quote Originally Posted by AlexDawson View Post
    I know my website will outright work
    Browser detection isn't always reliable. What if an embedded browser inside some application is based on IE (as, unfortunately, a few ISPs supply), but is programmed to send a different browser name?

    Then the user wouldn't be able to view your site at all.
    Jake Arkinstall
    "Sometimes you don't need to reinvent the wheel;
    Sometimes it's enough to make that wheel more rounded" - Molona

  7. #32
    AutisticCuckoo (SitePoint Author)
    I assume Alex examines the Accept HTTP header, rather than relying on primitive browser sniffing. That's what I do on my blog.

    That has the unfortunate effect of serving XHTML to Safari and Chrome, which state that they prefer application/xhtml+xml over text/html, yet they don't fully support XHTML. Oh well, that's not really my fault, is it?
    Birnam wood is come to Dunsinane

  8. #33
    AlexDawson (Follow: @AlexDawsonUK)
    Quote Originally Posted by AutisticCuckoo View Post
    I assume Alex examines the Accept HTTP header, rather than rely on primitive browser sniffing. That's what I do on my blog.
    Yep, that's exactly how I go about it; it would not make sense to sniff browsers when spoofing is so easily achieved.

