
digitalmars.D - Valid XHTML Pages

reply "Bob W" <nospam aol.com> writes:
I have just uploaded two samples for the Digital Mars
and DMD index pages which fully comply with the
XHTML 1.0 Strict standard.

If it makes any sense I could also prepare four
XHTML compliant templates meant to accommodate
the contents of the rest of the DMD pages (or Ddoc
generated contents) - unless Walter has got different
plans.


The reason for this:

After testing several home pages last week, I felt that
DMD might want to place itself amongst the few elite
home pages that are fully compliant with W3C standards.
At the moment the DigitalMars and DMD home pages
contain 34 and 49 errors respectively.

That actually compares favourably to the 1000+ validation
errors on the pages of some of the best-known internet
players (CNET and Google News).

"Good guys" tested were IBM, Mozilla, Opera and
W3C itself, which all offer W3C conforming home pages.

While some "techies" such as Sun, AMD and HP still had
fewer errors on their pages than DigitalMars, companies
like Dell, AOL and Yahoo showed hundreds of errors when I
tested their home pages last week.


Now to the sensitive part of my survey - Linux vs. M$oft:

linux.org - 22 validation errors
linux.com -  76 validation errors

And this is the test result of the home page of the
company which has the reputation of rejecting all
public standards:

M$oft - 0 (repeat: zero) errors !!! (valid HTML 4.0)

Now stop whining Linus fans - here's the deal:
M$oft has to maintain zillions of web pages. Therefore
the odds of finding a non-conforming one are not
too bad. Just start digging!   ;-)
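(Editorial aside: the validator checks pages against the XHTML 1.0 Strict DTD, but the first gate every page must pass is plain XML well-formedness. A minimal sketch in Python, standard library only; this is only a necessary condition, not a substitute for the W3C validator:)

```python
import xml.etree.ElementTree as ET

def is_well_formed(page: str) -> bool:
    """True if the page parses as XML - a necessary (but not
    sufficient) condition for XHTML 1.0 Strict validity."""
    try:
        ET.fromstring(page)
        return True
    except ET.ParseError:
        return False

valid_page = (
    '<html xmlns="http://www.w3.org/1999/xhtml">'
    "<head><title>Digital Mars</title></head>"
    "<body><p>Hello</p></body></html>"
)

# A typical legacy-HTML error: an unclosed <p> element.
broken_page = "<html><body><p>unclosed paragraph</body></html>"
```

Well-formedness catches the bulk of what validators flag on legacy HTML (unclosed or mis-nested tags); the DTD then adds the content-model checks on top.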
Oct 04 2005
next sibling parent "Bob W" <nospam aol.com> writes:
Here they are ... (attachment)
Oct 04 2005
prev sibling next sibling parent reply "Walter Bright" <newshound digitalmars.com> writes:
Thanks! I put in your changes to the Digital Mars home page. I partially did
it for the DMD home page, because doing it all the way means editing all
the files, and I'm not up for that at the moment. They're all generated from
the same set of macros.

"Bob W" <nospam aol.com> wrote in message
news:dhv3vs$2al3$1 digitaldaemon.com...
 I have just uploaded two samples for the Digital Mars
 and DMD index pages which fully comply with the
 XHTML 1.0 Strict standard.

 If it makes any sense I could also prepare four
 XHTML compliant templates meant to accommodate
 the contents of the rest of the DMD pages (or Ddoc
 generated contents) - unless Walter has got different
 plans.


 The reason for this:

 After testing several home pages last week, I felt that
 DMD might want to place itself amongst the few elite
 home pages that are fully compliant with W3C standards.
 At the moment the DigitalMars and DMD home pages
 contain 34 and 49 errors respectively.

 That actually compares favourably to the 1000+ validation
 errors on the pages of some of the best-known internet
 players (CNET and Google News).

 "Good guys" tested were IBM, Mozilla, Opera and
 W3C itself, which all offer W3C conforming home pages.

 While some "techies" such as Sun, AMD and HP still had
 fewer errors on their pages than DigitalMars, companies
 like Dell, AOL and Yahoo showed hundreds of errors when I
 tested their home pages last week.


 Now to the sensitive part of my survey - Linux vs. M$oft:

 linux.org - 22 validation errors
 linux.com -  76 validation errors

 And this is the test result of the home page of the
 company which has the reputation of rejecting all
 public standards:

 M$oft - 0 (repeat: zero) errors !!! (valid HTML 4.0)

 Now stop whining Linus fans - here's the deal:
 M$oft has to maintain zillions of web pages. Therefore
 the odds of finding a non-conforming one are not
 too bad. Just start digging!   ;-)

Oct 04 2005
parent reply "Bob W" <nospam aol.com> writes:
"Walter Bright" <newshound digitalmars.com> wrote in message 
news:dhvleh$2s9v$1 digitaldaemon.com...

 Thanks! I put in your changes to the Digital Mars home page.

Congrats - of the top 30 in the Fortune 500, only IBM's home page can now match the conformance quality of DigitalMars.com.
 I partially did it for the DMD home page,

Does not look like it (yet).
 because doing it all the way means editting all the files,
 and I'm not up for that at the moment. They're all generated
 from the same set of macros.

Do you still need the aforementioned templates, or will they not fit into your macros/Ddoc scheme anyway?
Oct 05 2005
parent "Walter Bright" <newshound digitalmars.com> writes:
"Bob W" <nospam aol.com> wrote in message
news:di0dn8$hah$1 digitaldaemon.com...
 "Walter Bright" <newshound digitalmars.com> wrote in message
 news:dhvleh$2s9v$1 digitaldaemon.com...

 Thanks! I put in your changes to the Digital Mars home page.

Congrats - of the top 30 in the Fortune 500, only IBM's home page can now match the conformance quality of DigitalMars.com.

The congrats go to you, because you did it!
 I partially did it for the DMD home page,


Haven't uploaded it yet.
 because doing it all the way means editting all the files,
 and I'm not up for that at the moment. They're all generated
 from the same set of macros.

 Do you still need the aforementioned templates, or will
 they not fit into your macros/Ddoc scheme anyway?

They'll fit in. It's just that the documentation files are incompletely converted over to using the macros rather than direct markup. They have to use the macros 100%; then converting to strict XHTML will work.
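(Editorial aside: Walter's macro scheme can be sketched with a toy expander. The macro names and definitions below are hypothetical, not the actual Ddoc macro set; the point is that once pages are written purely in macro calls, redefining one table of macros retargets every page, e.g. from legacy presentational HTML to strict-valid markup:)

```python
import re

# Hypothetical macro tables. LEGACY emits <font>, which XHTML 1.0
# Strict forbids; STRICT emits a styled <span> instead. The pages
# themselves never change - only the table does.
LEGACY = {"RED": '<font color="red">$0</font>'}
STRICT = {"RED": '<span style="color: red">$0</span>'}

CALL = re.compile(r"\$\((\w+) ([^()]*)\)")

def expand(text: str, macros: dict) -> str:
    """Expand Ddoc-style $(NAME body) calls, substituting body for $0,
    until no calls remain."""
    while True:
        new = CALL.sub(
            lambda m: macros[m.group(1)].replace("$0", m.group(2)), text)
        if new == text:
            return new
        text = new

page = "$(RED Deprecated): see the changelog."
```

Running the same source through each table gives the legacy or the strict rendering, which is why the conversion only works once the files use macros 100%.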
Oct 05 2005
prev sibling parent reply James Dunne <james.jdunne gmail.com> writes:
Bob W wrote:
 I have just uploaded two samples for the Digital Mars
 and DMD index pages which fully comply with the
 XHTML 1.0 Strict standard.
 
 If it makes any sense I could also prepare four
 XHTML compliant templates meant to accommodate
 the contents of the rest of the DMD pages (or Ddoc
 generated contents) - unless Walter has got different
 plans.
 
 
 The reason for this:
 
 After testing several home pages last week, I felt that
 DMD might want to place itself amongst the few elite
 home pages that are fully compliant with W3C standards.
 At the moment the DigitalMars and DMD home pages
 contain 34 and 49 errors respectively.
 
 That actually compares favourably to the 1000+ validation
 errors on the pages of some of the best-known internet
 players (CNET and Google News).
 
 "Good guys" tested were IBM, Mozilla, Opera and
 W3C itself, which all offer W3C conforming home pages.
 
 While some "techies" such as Sun, AMD and HP still had
 fewer errors on their pages than DigitalMars, companies
 like Dell, AOL and Yahoo showed hundreds of errors when I
 tested their home pages last week.
 
 
 Now to the sensitive part of my survey - Linux vs. M$oft:
 
 linux.org - 22 validation errors
 linux.com -  76 validation errors
 
 And this is the test result of the home page of the
 company which has the reputation of rejecting all
 public standards:
 
 M$oft - 0 (repeat: zero) errors !!! (valid HTML 4.0)
 
 Now stop whining Linus fans - here's the deal:
 M$oft has to maintain zillions of web pages. Therefore
 the odds of finding a non-conforming one are not
 too bad. Just start digging!   ;-)
 

That was a completely pointless comparison between Linux and M$oft. Also, the linux.com home page is "Copyright 2005 - OSTG, Inc., All Rights Reserved" - so it's just another Fortune 500 company with less-than-perfect W3C compliance. After all, browsers are so messed up nowadays that pages complying with W3C standards don't mean jack squat unless the browsers are compliant.
Oct 05 2005
next sibling parent reply pragma <pragma_member pathlink.com> writes:
In article <di15in$172a$1 digitaldaemon.com>, James Dunne says...
 After all, browsers are so messed up 
nowadays that pages complying with W3C standards don't mean jack squat 
unless the browsers are compliant.

Please, don't take this rebuttal as hostile, as I respectfully disagree with this. Also, I think this is a discussion that has been overdue in this group, as others undoubtedly share this opinion. Others are encouraged to join in. :)

For most people seeing this post, browsing the web from a desktop system using Mozilla/Firefox or IE is probably the only practical experience they will have with consuming the efforts of web developers. From this perspective, James has an extremely valid point; why go through the trouble when *every* known implementation of these standards is broken? After all, there are megabytes worth of blogs, websites and journals out there dedicated to this browser hack, or that browser bug... this is where real web developers live, right?

The notion of compliance has goals and ramifications well beyond all of this. Take the following writeup for example:

-- "HTML Standards Compliance - Why Bother?" --
http://www.wdvl.com/Authoring/HTML/Standards/

This article pretty much hits all the important points, and even busts up a few common myths. There are other, more subtle points too, like Search Engine Ordering:

http://www.beanstalk-inc.com/articles/se-friendly-design/w3c-compliance.htm

Which basically says that spiders are technically browsers too, so rather than try to dig up what parsing quirks they have, you're better off staying standards compliant. The short, short version: sites that are more compliant consistently rank higher than their peers.

Another gem from this article, which is quickly brushed aside, is that there are other kinds of clients out there as well. PDAs, cellphones and devices for disabled users can all consume web pages whether they were intended for them or not. While not as important as search engine ordering itself, it can only help if you're looking slightly less like garbage on someone's Nokia or Pocket PC.

With respect to the Digital Mars D site, and W3C compliance, here's my take on things.
Being able to stamp "W3C compliant" all over the site shows that D is espoused by people who know what the heck they're doing, and do it well. Unlike most other self-applied labels, it can be independently validated by anybody, which is a huge plus; it's not as arbitrary as, say, the vendor-supplied benchmarks on your favorite graphics card.

Also, a website for something as grass-roots run as D is as much a PR effort as it is a place to look up the Phobos API or download the latest compiler. If there is one more thing that this whole community can do to put D's best foot a little farther forward, then why not? Who here wouldn't want to see D Development Kits for webservers, palmtops, Pocket PCs or game consoles? Well, that's not going to happen if would-be D hackers, open-minded CTOs and IT business managers can't even take the site as seriously as we do.

In short: we can't just make *D* better than everyone else out there, we have to make *everything related to D* better. Standards compliance is a big part of that. :)

- EricAnderton at yahoo
Oct 05 2005
next sibling parent "Walter Bright" <newshound digitalmars.com> writes:
You made the case better than I. Thanks!
Oct 05 2005
prev sibling parent reply James Dunne <james.jdunne gmail.com> writes:
pragma wrote:
 In article <di15in$172a$1 digitaldaemon.com>, James Dunne says...
 
After all, browsers are so messed up 
nowadays that pages complying with W3C standards don't mean jack squat 
unless the browsers are compliant.

Please, don't take this rebuttal as hostile, as I respectfully disagree with this. Also, I think this is a discussion that has been overdue in this group, as others undoubtedly share this opinion. Others are encouraged to join in. :)

For most people seeing this post, browsing the web from a desktop system using Mozilla/Firefox or IE is probably the only practical experience they will have with consuming the efforts of web developers. From this perspective, James has an extremely valid point; why go through the trouble when *every* known implementation of these standards is broken? After all, there are megabytes worth of blogs, websites and journals out there dedicated to this browser hack, or that browser bug... this is where real web developers live, right?

Don't get me wrong; I'm all for standardization of things which need to be standardized. I just feel that the web, as it has grown to be today, has lost its footing here. So many companies and users are abusing what was meant to be an extremely simple protocol for delivering marked-up static content. This standardization push has come due to overwhelming demand for the ability to do stupid things: stateful application delivery over a stateless, one-shot connection protocol.

I'm saying this even though I work with ASP.NET in my day job. But not once in the whole process of developing our site was anyone concerned with following W3C standards - the push was on getting a correctly rendered site in Internet Explorer (the target audience). No sane web developers (IMO, of course) are going to waste time attempting to force W3C compliance of their site for no real, foreseeable gain.

BTW, we're pushing out a culinary site, so I'm not sure how well blind users are suited to practice the culinary arts, let alone browse our site. =P
 The notion of compliance has goals and ramifications well beyond all of this.
 Take the following writeup for example:
 
 -- "HTML Standards Compliance - Why Bother ?" --
 http://www.wdvl.com/Authoring/HTML/Standards/
 

Have yet to read this...
 This article pretty much hits all the important points, and even busts up a few
 common myths.  There are other, more subtle points too, like Search Engine
 Ordering:
 
 http://www.beanstalk-inc.com/articles/se-friendly-design/w3c-compliance.htm
 
 Which basically says that spiders are technically browsers too, so rather than
 try to dig up what parsing quirks they have, you're better off staying
 standards compliant.  The short, short version: sites that are more compliant,
 consistently rank higher than their peers.

This is purely a side-effect of the HTML parsing engine used within the spider. If the spider program is actually written to reject pages that are not standards compliant or which have minor errors, then what is the point of the spider? In fact, a spider should not care about the actual content of the page, how it is laid out in HTML, or what type of content it is. It should simply crawl for search terms or links, and index the page accordingly.
 Another gem from this article, which is quickly brushed aside, is that there are
 other kinds of clients out there as well.  PDAs, cellphones and devices for
 disabled users can all consume web pages whether they were intended for them or
 not.  While not as important as search engine ordering itself, it can only help
 if you're looking slightly less like garbage on someone's Nokia or Pocket PC.

I understand these points perfectly well. But most PDAs, cellphones, and other "web-enabled" devices are borrowing already-written browser code, say from Mozilla or Internet Explorer, inheriting all their incompatibilities and non-conformance issues. Of course, there are the select few that would write an embedded browser and do it the right way from the start, but the number of sites (I believe you called them 'elite') out there that are actually practicing conformance is very small, and thus implementing a fully standards-compliant browser is meaningless unless it correctly renders all the *incorrect* HTML/XHTML/CSS/whatever the case may be out there, in addition to the W3C-compliant pages.
 With respect to the Digitalmars' D site, and W3C compliance, here's my take on
 things.
 
 Being able to stamp "W3C compliant" all over the site shows that D is espoused
 by people who know what the heck they're doing, and do it well.  Unlike most
 other self-applied labels, it can be independently validated by anybody, which
 is a huge plus; it's not as arbitrary as, say, the vendor-supplied benchmarks on
 your favorite graphics card.

"People who know what the heck they're doing" is a very relative phrase. Walter obviously knows what the heck he's doing in compilers and language design, otherwise there'd be no D language or D reference compiler. However, I think it's been abundantly clear that he didn't know "what the heck he was doing" in terms of site management and W3C compliance - no offense to Walter, since I obviously find such things irrelevant =). These two areas of "people who know what the heck they're doing" are radically different and practically incomparable, and to lump them together into some sort of collective measure of intelligence doesn't make any sense.

On your second point, I would actually value the vendor-supplied benchmark of my favorite graphics card higher than something as abstract as W3C compliance of a site. Better benchmarks and proof of performance mean a lot more to me ;). And actually, if I had the card myself, I could run the same benchmarks as the vendor did and validate the results for myself.

Lastly, stamping "W3C compliant" all over the site would lower the signal-to-noise ratio of the site ;)
 Also, a website for something as grass-roots run as D, is as much a PR effort as
 it is a place to lookup the Phobos API, or download the latest compiler.  If
 there is one more thing that this whole community can do to put D's best foot a
 little farther forward, then why not?

Why not? Simply a matter of time, and that I believe there is no real gain in doing so. Obviously there is no set release schedule that is publicly available for the D reference compiler or its corresponding language reference documentation, but I believe time would be better spent on improving and fixing the actual content of the product to be delivered rather than the presentation medium.
 Who here wouldn't want to see D Development Kits for Webservers,
 Palmtops, Pocket PCs or Game Consoles?

 Well, that's not going to happen if would-be D hackers, open-minded
 CTOs and IT Business Managers can't even take the site as
 seriously as we do.

If these people are seriously concerned with such insignificant and pointless tripe as standards compliance on the web, then they are not worth our time. =)
 In short: we can't just make *D* better than everyone else out there, we have to
 make *everything related to D* better.  Standards compliance is a big part of
 that. :)
 
 - EricAnderton at yahoo

My conclusion:

The problem is that HTML started out as an unstrict standard, and trying to later force an unstrict standard to become a strict standard is pointless, since all the existing content complying with that original standard is obviously not strict, and forcing it to become strict would break a lot of things.

Then comes the addition of the DOCTYPE tag, which specifies which standard the content tries to comply with (grossly oversimplified to either a strict one or an unstrict one). This is also effectively meaningless, since the content creator can simply choose to comply with an unstrict standard, thereby rendering the stricter standards meaningless. You are free to disagree here, and I do see a few areas where I can be wrong or misleading.

In summary, if you can specifically name several *actual* reasons why W3C compliance will produce significant gains and how it could somehow radically change the web and fix all the broken content in it, then I'll consider it. =)
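(Editorial aside: the DOCTYPE declaration discussed here is machine-readable - browsers use it to choose a rendering mode - and a consumer can classify a page by it. A naive sketch; the classification rules below are simplified assumptions, not any real browser's sniffing logic:)

```python
def doctype_flavor(page: str) -> str:
    """Naively classify a page by its DOCTYPE declaration."""
    head = page.lstrip()[:250].lower()
    if not head.startswith("<!doctype"):
        return "none"  # most browsers fall back to quirks mode here
    if "strict" in head:
        return "strict"
    if "transitional" in head or "loose" in head:
        return "transitional"
    return "other"

strict_page = (
    '<!DOCTYPE html PUBLIC "-//W3C//DTD XHTML 1.0 Strict//EN"\n'
    '  "http://www.w3.org/TR/xhtml1/DTD/xhtml1-strict.dtd">\n'
    '<html xmlns="http://www.w3.org/1999/xhtml"></html>'
)
```

This is exactly James's point in miniature: nothing stops an author from declaring the loosest DOCTYPE they like, so the declaration only promises which rule set the page *claims* to follow.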
Oct 06 2005
parent Jari-Matti Mäkelä <jmjmak invalid_utu.fi> writes:
James Dunne wrote:
 My conclusion:
 
 The problem is that HTML started out as an unstrict standard, and trying 
 to later force an unstrict standard to become a strict standard is 
 pointless, since all the existing content complying with that original 
 standard is obviously not strict and forcing it to become strict would 
 break a lot of things.
 
 Then comes the addition of the DOCTYPE tag which specifies to which 
 standard the content tries to comply with (grossly oversimplified to 
 either a strict one or an unstrict one).  This is also effectively 
 meaningless, since the content creator can simply choose to comply with an 
 unstrict standard, thereby rendering the stricter standards meaningless. 
  You are free to disagree here, and I do see a few areas here where I 
 can be wrong or misleading.
 
 In summary, if you can specifically name several *actual* reasons why 
 W3C compliance will produce significant gains and how it could somehow 
 radically change the web and fix all the broken content in it, then I'll 
 consider it. =)

In the case of the Digital Mars website, I hope that this new standards compliance helps Walter achieve more. Cascading style sheets save a lot of developer time, thus allowing Walter to focus more on making the compiler even better. Besides, centralized style sheets decrease web server bandwidth usage.

I hope you can find some answers here: http://www.w3.org/MarkUp/2004/xhtml-faq
Oct 06 2005
prev sibling parent "Walter Bright" <newshound digitalmars.com> writes:
"James Dunne" <james.jdunne gmail.com> wrote in message
news:di15in$172a$1 digitaldaemon.com...
 After all, browsers are so messed up
 nowadays that pages complying with W3C standards don't mean jack squat
 unless the browsers are compliant.

The browsers are getting better at this. The reasons for W3C compliance are:

1) It presents a professional image for D.

2) In the future, I expect that browsers will become more demanding of W3C compliance. Might as well fix the pages in the general course of updating them.

3) It makes D more accessible to blind users (yes, they exist!).

4) It may improve the pagerank of D in Google.

5) It makes it more likely that web crawlers and specialized browsers can handle the D pages correctly.

The downside of W3C compliance is it is just so darned ugly looking. Fortunately, Ddoc can hide most of that.
Oct 05 2005