Very helpful. I just changed the titles for my website because of this article. Since I use WordPress I also tried to change a few of the headings, but I can't seem to get them right the way you describe.
I think the majority of these issues come down to time for web designers and web design firms. How far should a web designer go when providing some SEO integration on a site? If a client's not paying for the service, then it just doesn't make good business sense for a web designer to spend time adding all these extra bits to make the difference. Plus, if you ask the client, they'll most likely tell you how you should be implementing SEO on their site. After all, they have a computer sitting on their desk, which qualifies them as an SEO guru, especially after reading some bogus SEO article in their local mag by some wannabe SEO guy.
In my experience, the majority of businesses want a quick solution for nix, so in the end you get what you pay for.
Because bad practice is bad practice whichever way you cut it. Just because you might be able to get away with it doesn't mean you should.
If your competition is cutting corners by writing crappy titles and dropping alts, it's up to you to educate your clients about those facts. Explain to them exactly how your work is better, and how that will translate into the success of their site.
Likewise, if you were a builder and your competition used low-quality pine rather than hardwood, you should clarify how your competition's savings are being made, and how that will affect the longevity, performance and overall quality of the final product.
Most of these things take no extra time at all. Would you really need to add another day of development just to add meaningful alt attributes to images?
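To put a number on it, here's a quick sketch of the difference (the file names and wording are just made up for illustration):

    <!-- Vague: tells search engines and screen readers nothing -->
    <img src="images/img_0417.jpg" alt="photo" />

    <!-- Meaningful: describes what the image actually shows -->
    <img src="images/oak-dining-table.jpg"
         alt="Handcrafted oak dining table with six matching chairs" />

The second version takes maybe ten seconds longer to type than the first.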
Similarly, you don't save time by structuring your content badly – it still takes time to do that, so you might as well structure it properly from the start.
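Again, purely as an illustration (the topic and headings are invented), a sensible heading outline costs nothing extra:

    <h1>Choosing Hardwood Flooring</h1>
    <h2>Types of Hardwood</h2>
    <h3>Oak</h3>
    <h3>Maple</h3>
    <h2>Installation Costs</h2>

It's the same amount of typing either way; one version just happens to give users and search engines a visible structure.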
This should be filed under “Web Design 101” instead of SEO. I can see AutisticCuckoo objecting to the use of “describe the image” rather than “provide alternative content” but that’s his call, not mine.
Although this is a decent article, it is missing what I consider THE worst SEO mistake made today: plain and simple, it's CRAP code.
Yeah, I'm an advocate of using compliant code for ALL the wrong reasons, but I will say it for the millionth time: garbage code screws up your rankings and makes you work 200 times harder at getting decently indexed.
If 99.9% of the so-called developers would get their heads out of their butts and put the effort into W3C-compliant, tableless XHTML code, they wouldn't have to work so damn hard at getting rankings! I have proven it time after time: streamlined code makes for better (and faster) rankings! Take my site, for instance (in my siggie) … in under 7 months I have gone from not existing to PR4 across the board, and I own over 10 top-3 positions for the keywords I wanted.
So yes, this article helps, but if you start with garbage code (all you DW and FP users) you are working 10 times harder to get your sites the rankings they deserve! Learn to code RIGHT and you will make your life 1000 times easier.
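For anyone wondering what I mean, here's a bare-bones sketch of the kind of page I'm talking about (the title and content are obviously just placeholders):

    <!DOCTYPE html PUBLIC "-//W3C//DTD XHTML 1.0 Strict//EN"
        "http://www.w3.org/TR/xhtml1/DTD/xhtml1-strict.dtd">
    <html xmlns="http://www.w3.org/1999/xhtml" xml:lang="en" lang="en">
    <head>
      <meta http-equiv="Content-Type" content="text/html; charset=utf-8" />
      <title>Example Widgets | Hand-made Widgets and Accessories</title>
    </head>
    <body>
      <div id="content">
        <h1>Hand-made Widgets</h1>
        <p>All layout handled by CSS, not nested tables.</p>
      </div>
    </body>
    </html>

Run that through the W3C validator and it passes, and there's nothing in it for a spider to choke on.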
I am surprised the article did not mention the use of Flash. Flash is not indexed by search engines; plus, I hate it.
MSN recently started penalising sites that use link exchanges, so I would add link exchanges to the list.
@dc dalton:
You are probably right, but it is also true that 100% W3C-compliant code can result in a site not being rendered correctly by the most popular browser, IE.
But I agree all new sites should be made W3C compliant. It's not really worth the investment for existing sites, however.
I think it's understandable to anyone, noob or professional, that Flash isn't readable by SEs (one of them does read it, though, doesn't it?), so I don't really think Flash needed to be listed when there were more important things to cover.
Man, do I hate Flash as well. A Flash banner would be all right, but some of those full-Flash templates from TemplateMonster make me sick.
Dalton, seriously, I've had no problems getting #1 rankings with bad, table-filled markup. Clean code is good, but search engines exist to provide relevant results to their users, not to police the webmaster community for good coding practice.
As for this article… I agree with the others: pretty basic, mostly fluff. Maybe they are 5 mistakes, but they aren't the 5 worst or most common ones by any means. I've only been pondering the topic for 3 minutes and I've already thought of 4 that should be on there. I also find #5 a bit of a stretch. Misspellings can be good for SEO because, quite frankly, users misspell all the time. Also, considering how well some scraper sites can do by just mishmashing sentences together without any grammatical consideration, I doubt any such penalty exists.
Finally, there is more than one language on this earth, and there is no reason why someone cannot use two languages on a webpage. For instance, I could say soup du jour instead of Soup of the Day, and that is French. I could say hola instead of hello, and that is Spanish. I could say simba instead of lion, and that is Swahili. So are they really running a universal dictionary that compares all languages, including common slang? I doubt it.
Obviously, if your site is ever reviewed by a human, good grammar and spelling are important, but a human review and an automatic algorithm are not the same thing.
Besides, PageRank already takes this into account. If a site is less useful because of spelling mistakes, it will naturally have fewer incoming links and naturally rank lower. If it is still useful regardless of spelling mistakes, it will not be penalized. The cream rises to the top. There is no need to make generalized assumptions about site quality based on an arbitrary factor when you already have such a good quality indicator (incoming links).
Methinks (that's Old English) I shall blog my own top-5 list.
Amen. Couldn't agree with this more. I just came from another thread where Dalton was preaching about coding standards… Maybe you should build your own CSE for well-coded sites?
I am not saying you can't get a site ranked well with garbage code, not saying that at all … what I am saying is that it is easier to get clean-coded sites ranked, and it will happen quicker.
Is it because Google validates your HTML and gives you bonus points for having a strict DTD?
Or rather, is it because well-coded sites load and render more quickly, thus creating a better user experience and garnering more incoming links as a result?
Just because something may make your site better doesn't mean search engines need to give a bonus for it. If it truly is better, it will get more links and rank better naturally.
There are quite a few more things to consider when writing your site title.
- Google only uses the first 8 words of your title in SERPs.
- The example title you used might be better off using a pipe (|) rather than the word "by", in order to squeeze another word into the title (see the sketch after this list).
- When writing a site title you might want to weigh the branding part against the contextual part, like "10 Things you didn't know about Search Engine Optimisation | Google.com" versus "Google.com | 10 Things you didn't know about Search Engine Optimisation".
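To make the pipe suggestion concrete, here's roughly what I mean (the domain and wording are invented for the example):

    <!-- Content first, pipe separator, branding last -->
    <title>10 Things You Didn't Know About Search Engine Optimisation | example.com</title>

Swapping the two halves around gives you the branding-first variant.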
Umm, this is not correct. IE has no trouble with valid code. It cannot render actual XHTML (which is why there is no real point in using it), but if you serve it as text/html or just use an HTML doctype, Internet Explorer has no trouble rendering it.
However, there are quirks in how IE handles CSS; maybe that is what you meant.
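A rough illustration of the distinction (the markup itself is just a placeholder page):

    <!-- The exact same markup behaves differently depending on the
         Content-Type header the server sends with it:
           text/html              -> IE treats it as ordinary HTML and renders it
           application/xhtml+xml  -> true XHTML; IE 6/7 offer to download the file -->
    <!DOCTYPE html PUBLIC "-//W3C//DTD XHTML 1.0 Strict//EN"
        "http://www.w3.org/TR/xhtml1/DTD/xhtml1-strict.dtd">
    <html xmlns="http://www.w3.org/1999/xhtml">
    <head><title>Media type demo</title></head>
    <body><p>Same file, two very different results.</p></body>
    </html>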
Just a hint about the spell checking… Firefox has a built-in spell-checker. In my opinion it's by far the most necessary tool for web design, especially with the new document.designMode implementation; it makes web page authoring an absolute breeze for the layman (these being the people I'm directing this comment to).
The plugins (an utter plethora of which are available) and the management thereof are fantastic for someone like myself who has to develop in various technologies and languages… my favourites being 'IE Tab', 'DOM Inspector', 'Web Developer', and various JavaScript debuggers, depending on what I'm looking for exactly. Also check out things like MODIv2, a nifty DOM inspector using JavaScript injection from a bookmark. That means you just bookmark it and it's "installed" - very handy for anyone looking through the hierarchy and properties of the various elements on a page, for whatever reason, whether it be SEO (search engine optimization) or simply scripting.
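If anyone wants to try document.designMode, a minimal page like the sketch below (contents invented) is enough to play with it; whether the inline spell-checker kicks in on the editable text depends on your Firefox spell-check preferences:

    <html>
    <head><title>designMode demo</title></head>
    <body>
      <p>Once editing is switched on you can click and type anywhere,
         as if the whole page were one big text box.</p>
      <script type="text/javascript">
        // Make the entire document editable; reload the page to undo.
        document.designMode = 'on';
      </script>
    </body>
    </html>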