Multi-CMS SEO

What CMS is this website built with?

That is one of the top 5 questions an SEO asks when they start looking at a website.

Most of the time, the answer is WordPress. So you can simply "do SEO" by installing a plugin (if you do, I hope you don't choose Yoast, but that's another story) and making some quick configurations.*

*Most of the time, that kind of plugin only adjusts the essential on-page elements, creates a sitemap, and shows some broken links. The cost is a plugin as big as a blue whale, damaging your performance and polluting your database, so get rid of that damn plugin and use a proper SEO tool.*

Sometimes, the answer is a little more complicated, like Drupal, Magento, or Shopify. They are less common, but you will still find plenty of plugins, add-ons, extensive documentation, and a community of developers.

On a bad day, the answer will be a corporate CMS like Adobe Experience Manager, Contentful, Umbraco, or Sitecore. If you are not working for a big company, it is unlikely you will ever get the chance to work with one of those. Documentation is scarce and often outdated.

Sometimes, you get an answer that is (almost) the most challenging of all:

It's a custom-made CMS created in <almost/already obsolete technology> with <absolutely zero documentation> and <an absolute disregard for any standard> that <only one person in the company understands>.

If you are particularly unlucky, the answer is:

"no CMS at all; we have a template connected to an endpoint through an API that generates the website. There is zero chance of changing any content or front-end code because we are paying for a service that provides everything without human intervention. If you change a tiny piece of code, everything will fall apart."

A few times, what you will hear is (in my opinion) the worst nightmare of any SEO:

"We have several CMS combined, working together at the same time".

This situation is more frequent than most people believe. In my experience, it happens when big websites are heavily modularized: built not just at different times, but for different needs, teams, and technologies.

One CMS controls the post-login user interface. Another handles the e-commerce part. Another, the blog. Another manages some multiregional features. Another serves the PLPs and offers. Maybe some parts live on subdomains or in subfolders. Perhaps you see a different domain heavily integrated with the main one (in the menu and internal links): that's because someone decided, with some good judgment, that it was easier to build something on a completely new domain than to try to fit another block into the Jenga tower.

So, what can you do in this fucked-up scenario?

Before going nuts, there are a few things you can do:

Of course, you cannot just recommend putting everything in one CMS. That could be, in theory, convenient for you, but an expensive clusterfuck for everyone else.

In my opinion, this is one of the most complicated scenarios, and you need a deep reserve of high-level SEO knowledge to handle it. If you think you can win this by talking about duplicated titles or canonicalized URLs, think again.

Run several audits and try to get the best possible understanding of the situation. These websites are huge, and you will likely need a cloud-based crawler. If you cannot afford one, don't attempt a full audit (your machine probably won't handle it); run several smaller ones instead, trying to find the primary pain points per CMS/section. The good news is that you will have months of work ahead, and in this economy, that is a good thing.
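If you want to make the "several smaller audits" idea concrete, here is a minimal Python sketch (using the requests library): it buckets a URL list by its first path segment as a rough proxy for "which CMS serves this" and spot-checks a sample from each bucket. The URL list, sample size, and bucketing logic are all placeholders; a real audit also needs rate limiting, robots.txt compliance, and probably JS rendering.

```python
"""Segmented mini-audit sketch: sample and spot-check each CMS/section."""
import random
from collections import defaultdict
from urllib.parse import urlparse

import requests

# Placeholder: in practice, export these from your sitemaps or analytics.
ALL_URLS = [
    "https://example.com/blog/post-1",
    "https://example.com/shop/product-42",
    "https://example.com/account/settings",
]
SAMPLE_PER_SECTION = 50

# Bucket by first path segment, a rough proxy for the CMS behind it.
buckets = defaultdict(list)
for url in ALL_URLS:
    section = urlparse(url).path.strip("/").split("/")[0] or "(root)"
    buckets[section].append(url)

for section, urls in buckets.items():
    sample = random.sample(urls, min(SAMPLE_PER_SECTION, len(urls)))
    issues = []
    for url in sample:
        try:
            # allow_redirects=False so redirects show up as findings too.
            r = requests.get(url, timeout=10, allow_redirects=False)
            if r.status_code != 200:
                issues.append((url, r.status_code))
        except requests.RequestException as exc:
            issues.append((url, str(exc)))
    print(f"{section}: {len(urls)} URLs, {len(issues)} issues in sample")
    for url, problem in issues[:5]:
        print(f"  {url} -> {problem}")
```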

Fix the fundamentals: sitemaps & robots.txt. Those control the whole domain and can be an excellent place to start.
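Here is a quick, standard-library-only sanity check of those fundamentals, assuming your sitemaps are declared in robots.txt (the domain is a placeholder): it reads robots.txt, pulls the URLs from each declared sitemap, and flags any that Googlebot would be blocked from fetching.

```python
"""Check that sitemap URLs are not blocked by robots.txt (stdlib only)."""
import urllib.request
import urllib.robotparser
import xml.etree.ElementTree as ET

DOMAIN = "https://example.com"  # placeholder

rp = urllib.robotparser.RobotFileParser()
rp.set_url(f"{DOMAIN}/robots.txt")
rp.read()

# Sitemaps declared in robots.txt; fall back to the conventional path.
sitemaps = rp.site_maps() or [f"{DOMAIN}/sitemap.xml"]

NS = "{http://www.sitemaps.org/schemas/sitemap/0.9}"
for sitemap_url in sitemaps:
    with urllib.request.urlopen(sitemap_url) as resp:
        tree = ET.parse(resp)
    # Note: a sitemap *index* needs one more level of recursion (omitted).
    locs = [el.text for el in tree.iter(f"{NS}loc")]
    blocked = [u for u in locs if not rp.can_fetch("Googlebot", u)]
    print(f"{sitemap_url}: {len(locs)} URLs, {len(blocked)} blocked")
    for u in blocked[:10]:
        print(f"  blocked: {u}")
```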

Document everything: which server is running what, through which CDN, for which users. You will need to build a complete infrastructure map and figure out how Google is reading the site. This is critical for websites with personalized experiences (content that changes based on cookies or geolocation).
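A rough way to start that map is to fingerprint one representative URL per section through its response headers; Server, Via, X-Cache, and friends usually betray which stack and CDN sit behind each piece. The URLs and the header list below are assumptions, so adjust them to whatever your infrastructure actually exposes. On personalized sites, Vary and Set-Cookie are the headers to watch.

```python
"""Fingerprint the stack behind each section via response headers."""
import requests

# Placeholder: one representative URL per CMS/section.
REPRESENTATIVES = {
    "blog": "https://example.com/blog/",
    "shop": "https://example.com/shop/",
    "account": "https://example.com/account/login",
}

# Headers that commonly reveal server software, CDN, and caching.
REVEALING = ["server", "via", "x-powered-by", "x-cache", "cf-ray",
             "x-served-by", "vary", "cache-control", "set-cookie"]

for section, url in REPRESENTATIVES.items():
    r = requests.get(url, timeout=10)
    print(f"\n[{section}] {url} -> {r.status_code}")
    for header in REVEALING:
        # requests header lookups are case-insensitive.
        if header in r.headers:
            print(f"  {header}: {r.headers[header]}")
```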

Look for high-impact solutions. You could spend the rest of your life fixing broken links and manually adjusting metadata, so don't do that. Optimizing CSS delivery, GZIP compression, the CDN, sitemaps, or robots.txt often impacts hundreds, thousands, or millions of URLs at once.
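As one example of a check that scales, here is a sketch that verifies whether text content is actually served compressed; a single uncompressed template wastes bandwidth on every page that uses it. The URL list is a placeholder, and note that requests decompresses responses automatically, so the size printed is the decompressed body.

```python
"""Spot-check whether pages are served with GZIP/Brotli compression."""
import requests

# Placeholder: one URL per template/section you care about.
URLS = [
    "https://example.com/",
    "https://example.com/blog/",
    "https://example.com/shop/",
]

for url in URLS:
    # Explicitly advertise compression support, like a browser would.
    r = requests.get(url, timeout=10,
                     headers={"Accept-Encoding": "gzip, br"})
    encoding = r.headers.get("Content-Encoding", "none")
    # r.content is the decompressed body requests hands back.
    print(f"{url}: encoding={encoding}, body={len(r.content) // 1024} KB")
    if encoding == "none":
        print("  -> served uncompressed; fix it at the server/CDN level")
```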