
Checklist for SEO-compliant CMS templates

  • 6 min read

Unfortunately, it often happens that a search engine optimizer is only brought in when it is already too late, that is, when the website is finished and online. But calling in an SEO only after the go-live is far too late! By then even the smallest changes often mean a lot of work, all too often you hear “that’s not possible” or “that doesn’t pay off”, and in the long run neither the SEO nor the client will be happy with such a result.

The right Content Management System can help

Unfortunately, most Content Management Systems in their out-of-the-box state do not provide a decent foundation in terms of SEO. But I don’t want to dwell on the shortcomings of TYPO3 and Co., and I don’t want to recommend a particular CMS: the requirements and needs differ too much, and the choice is too big. By the way, shop systems are often far worse off when it comes to technical search engine optimization.

From my personal experience, I can recommend WordPress with the right plugins to quickly get smaller sites SEO-fit. For larger sites or more complex requirements, it is best to evaluate the candidates yourself and decide which CMS suits you best.

Every relaunch is a new chance!

So you are planning to rebuild the whole site?

No matter whether the existing CMS is to be modernized with a new template or everything is built from scratch: This is your chance to do many things right! The search engines will love you! 🙂

But what is important for the SEO-conform development of a CMS template?

Ensure indexation

It won’t help you to have the best content or greatest products on your site if the search engine can’t find them, or if indexing problems prevent a competitive ranking.

  • Avoid the use of framesets at all costs. Hopefully no further explanation is necessary.
  • Avoid the use of Flash. Flash is dead. Don’t argue. At least since the iPad!
  • Use structured markup! HTML offers a variety of tags to mark up specific content – use them!
  • Store graphical menu items with machine readable text using CSS image replacement techniques
  • Avoid storing texts in graphics. If you do, write the text in the alt attribute of the image tag
  • Provide an automatically generated sitemap.xml that lists all pages of the site according to the XML Sitemap standard, and submit it to the search engines. Register it in the Google Search Console and reference it in the robots.txt as well
  • Place a robots.txt in the root directory of the domain. There you can exclude files, folders and patterns of files from indexing, and specify the XML sitemap
  • The error page must provide the HTTP status code 404, so that the pages that no longer exist are also removed from the index of the search engines
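
A minimal robots.txt covering the last three points could look like the following sketch; the excluded paths and the sitemap URL are placeholders and depend on your CMS:

```
# Applies to all crawlers
User-agent: *
# Example exclusions – adjust to your CMS's internal folders
Disallow: /admin/
Disallow: /tmp/

# Point the search engines to the XML sitemap
Sitemap: https://www.example.com/sitemap.xml
```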

Avoid duplicate content

The golden rule is: Each content may be present exactly once on the internet! So every page should only be accessible under exactly one URL!

If this cannot be guaranteed, the following measures help:

  • <meta name="robots" content="noindex,follow" /> can be used in the page header to exclude duplicates or other pages from indexing
  • <link rel="canonical" href="eindeutige-url.html" /> in the header of every page, pointing to the correct, canonical and internally linked URL; this prevents a lot of duplicate-content problems
  • ATTENTION! Careless use of the canonical tag can have a devastating effect on your rankings!
  • 301 redirects if multiple domains are supposed to display the same content. Only 301 redirects are SEO-compliant and take the link juice with them!
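
For the multi-domain case, such a 301 redirect could be set up like this on an Apache server with mod_rewrite; the domain name is of course a placeholder:

```apacheconf
# .htaccess sketch: permanently redirect every request on a
# secondary domain to the same path on the canonical domain
RewriteEngine On
RewriteCond %{HTTP_HOST} !^www\.example\.com$ [NC]
RewriteRule ^(.*)$ https://www.example.com/$1 [R=301,L]
```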


Clean URL structure

A clean information architecture and the resulting URL structure are the be-all and end-all of any good site. Be sure to follow these points:

  • Use speaking URLs that reflect page titles and keywords
  • Avoid GET parameters in URLs, such as &id=2
  • A good URL lives forever. URLs should never change! If a change becomes unavoidable -> 301! Otherwise you give away valuable link power or lead visitors to a 404 error page
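
If the CMS itself only understands parameter URLs, speaking URLs can often be mapped onto them at the web server level. This mod_rewrite sketch assumes an Apache server; the paths and parameter names are made up:

```apacheconf
# Serve /products/green-garden-chair/ from the CMS's internal
# parameter URL without exposing ?page=... to the visitor
RewriteEngine On
RewriteRule ^products/([a-z0-9-]+)/?$ index.php?page=$1 [L,QSA]
```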

Internal linking

  • Use breadcrumb navigation. This offers users and the spider the best possible orientation
  • Give the editors the possibility to set links directly from within the text. Links in the body copy are the strongest!
  • All links should have a title attribute that is similar to the link text or can be assigned individually
  • Generate a user sitemap that is centrally accessible and displays all pages at a glance. This gives the spider shorter paths and the user a quick overview
  • JavaScript links and functions must always be stored with an indexable fallback
  • In the CMS there should be an option to add the rel="external nofollow" attribute to links in order to devalue them
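
A breadcrumb navigation as described above could be marked up roughly like this; the page names and paths are invented:

```html
<!-- Breadcrumb: descriptive link texts and title attributes -->
<nav aria-label="Breadcrumb">
  <ol>
    <li><a href="/" title="Home">Home</a></li>
    <li><a href="/garden/" title="Garden furniture">Garden furniture</a></li>
    <li>Green garden chair</li>
  </ol>
</nav>
```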

Integrating keywords

  • The page title of each page must be individually and manually maintainable. The most important keywords of each page belong here
  • Exactly one H1 heading should exist per page, and it should be manually maintainable. The most important keywords should be integrated here as well.
  • File names of images should be preserved after upload, so that files can be named according to the pattern ‘keyword1-keyword2.jpg’
  • Every image needs an alt attribute with an alt text that includes the relevant keywords.
  • The link text for navigation items can match the breadcrumb name, but must be maintainable independently of the page title. The title attribute should also be manually maintainable.
  • Provide the CMS editors with the tags h2-h6, ol, ul, p, q and blockquote, as well as em and strong
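
Putting these points together, a rendered page might contain markup along these lines; all keywords, file names and paths are placeholders:

```html
<head>
  <!-- Individually maintained title with the page's main keywords -->
  <title>Wooden garden chairs – Example Shop</title>
</head>
<body>
  <!-- Exactly one H1 per page -->
  <h1>Wooden garden chairs for every garden</h1>
  <!-- Keyword-based file name plus alt text -->
  <img src="garden-chair-wood.jpg" alt="Wooden garden chair in oiled oak" />
  <h2>Care tips</h2>
  <p>Body copy with an
    <a href="/care-tips/" title="Care tips for wooden furniture">in-text link</a>.
  </p>
</body>
```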


Optimize your metadata

Metadata is no longer as important as it was a few years ago, but it is very well suited to increasing traffic by improving the click-through rates in the search results:

  • If possible, create a unique and appealing description for each page. If it contains the search term, the term is highlighted in bold in the snippet in the search results and can encourage users to click
  • The meta tag ‘keywords’ can be ignored; it is hardly taken into account by any search engine any more
  • The meta tag ‘robots’ should always be set to ‘index,follow’, except for pages that you explicitly want to exclude. There you use ‘noindex,follow’.
  • The meta tags for content type, language and character encoding should be set correctly
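
In HTML this boils down to a head section like the following sketch; the description text is a placeholder:

```html
<!-- Character encoding -->
<meta charset="utf-8" />
<!-- Unique, hand-written description per page -->
<meta name="description" content="Short, appealing summary of this page." />
<!-- Default; use noindex,follow only on pages you want excluded -->
<meta name="robots" content="index,follow" />
<!-- The content language belongs on the root element: <html lang="en"> -->
```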

Keep page load time as short as possible

The page load time is a ranking factor, therefore:

  • Try to keep the page load time low, i.e. minimize file sizes and the number of server requests
  • JavaScript and CSS should be combined into central files and included as external files rather than inline
  • Use GZIP compression if possible
  • Many small icons or background graphics can be combined as CSS sprites in one file
  • Use caching mechanisms and deliver static pages if possible
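
On an Apache server, GZIP compression and browser caching can be switched on with a few lines of configuration; this sketch assumes mod_deflate and mod_expires are available:

```apacheconf
# Compress text-based resources before sending them
<IfModule mod_deflate.c>
  AddOutputFilterByType DEFLATE text/html text/css application/javascript
</IfModule>

# Let browsers cache static assets for a month
<IfModule mod_expires.c>
  ExpiresActive On
  ExpiresByType text/css "access plus 1 month"
  ExpiresByType application/javascript "access plus 1 month"
</IfModule>
```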

These, then, are the most important points for search engine optimized templates in content management systems.

For questions, suggestions, discussions and improvements I am always open and grateful. Links and mentions, as well as recommendations and free beer are also gladly accepted.

Kristin Eitel
