A search engine’s job is to match visitors with sites worth visiting. Search engines prefer the same things people do: compelling content from credible sources. From an SEO perspective (which, of course, is a limited view), all a site developer has to do is get out of the way of the content.
Here is a list of basic, must-do SEO tips:
Don’t put too much emphasis on graphics, photos, Flash, Java, or video. If you use graphic elements, make sure every item has a completed alt attribute. Make sure the CMS and media-management tools allow producers to write custom alt text and flag non-searchable content.
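A minimal sketch of what a completed image tag might look like (the file name and alt text here are invented for illustration):

```html
<!-- Descriptive file name plus alt text the bots can read -->
<img src="/images/mayor-ribbon-cutting.jpg"
     alt="Mayor cutting the ribbon at the new library opening">
```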
Offer the search bots up-to-date sitemaps and make sure that every bit of content can be spidered and indexed. Bots crawl the site by going from link to link, so developers and producers must build a strong internal link network with lots of links to every piece of content on your site. In particular, don’t hide navigation links in uncrawlable JavaScript.
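To illustrate that last point, here is a sketch of a crawlable link next to one buried in JavaScript (the URL is a placeholder):

```html
<!-- Crawlable: a plain anchor the bot can follow -->
<a href="/news/city-council-budget-vote">City council budget vote</a>

<!-- Hard to crawl: the destination only exists inside a script -->
<span onclick="window.location='/news/city-council-budget-vote'">City council budget vote</span>
```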
Clean URLs are important. Make sure your site doesn’t have complicated URLs: if they make sense to a person, they’ll make sense to a bot. Try to get keywords into the URLs, as close to the front as possible. Avoid more than one directory in your URLs. Use lowercase text and separate words with hyphens. Use static URLs instead of dynamic URLs. Use “nofollow” on comment links, and “canonical” links on pages the bots might otherwise treat as duplicate content. Use absolute URLs where possible.
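A sketch of those two link attributes, using example.com placeholder URLs:

```html
<!-- In the <head>: point bots at the preferred version of a page -->
<link rel="canonical" href="https://example.com/recipes/apple-pie">

<!-- On comment links: tell bots not to pass ranking credit -->
<a href="https://example.com/commenter-site" rel="nofollow">a commenter's site</a>
```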
Make sure staff can control key SEO elements on every page. Editors should be able to customize the page title, meta description, meta keywords, image names, image alt text, image captions, categories, and tags. Do not repeat the same title, or any other element, across every page.
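For instance, the per-page elements an editor should control might render like this (all values invented):

```html
<head>
  <!-- Each of these should be editable per page, not hard-coded in the template -->
  <title>Apple Pie Recipe | Example Gazette</title>
  <meta name="description" content="A step-by-step apple pie recipe from our test kitchen.">
  <meta name="keywords" content="apple pie, recipe, baking">
</head>
```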
The robots.txt file tells the bots what to index and what to ignore. Use it wisely. The same applies to .htaccess files.
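A minimal robots.txt sketch, assuming a hypothetical /admin/ area and internal search-results pages you don’t want indexed:

```
# Allow everything except admin and search-results pages
User-agent: *
Disallow: /admin/
Disallow: /search/

Sitemap: https://example.com/sitemap.xml
```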
Bots read a little HTML. Use validated HTML, with only one H1 tag per page, subheads in H2 and H3 tags, and so on.
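A sketch of that heading hierarchy, with invented headlines:

```html
<h1>City approves new library budget</h1>   <!-- one H1: the story headline -->
<h2>Where the money goes</h2>               <!-- subheads in H2 -->
<h3>Construction costs</h3>                 <!-- sub-subheads in H3 -->
```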
Avoid “cloaking” like the plague. Any attempt to hide content or display different page versions to different users can trigger a warning from Google.
Everyone likes a fast-loading site. Bots, like people, get distracted easily, and if your web page is slow to load, it may not get read or indexed. Google, in particular, now docks slow-loading sites and boosts sites that use good security (HTTPS).
Want more info on site-building?
Here are my favorite search optimization resources. And here’s a great periodic table of search engine factors and, of course, the most important SEO tool in the world.
Want more SEO best practices for editors?
Here is how to use search tools when making editorial decisions.