After the Redesign: Maintaining Web Traffic Through Online Marketing

Periodic redesign is a necessary part of making sure your website keeps up with the demands of an ever-changing online world. With regular updates and fresh approaches to your online marketing presence, you can stay relevant and competitive in your local market and beyond. But every change carries the risk of losing the organic traffic you were already attracting.

Here are a few ways to avoid this common consequence of redesigning your website and keep current traffic while also attracting new leads.

Changing Algorithms

Google changes its search algorithms constantly, often more than once a day. So when you see traffic losses that seem to coincide with your website relaunch, first rule out an algorithm update. Cross-reference your organic traffic across multiple search engines to see whether the losses appear everywhere. If significant decreases show up in only one search engine, the drop is more likely the result of that engine's algorithm change than the fault of your redesign; if traffic falls across all of them, the redesign itself is the more probable culprit.

301 Redirects

A frequent issue with a website relaunch occurs when URLs change as part of the update. Search engines will continue to send visitors to the old URLs through their current index, even though those pages no longer exist. 301 redirects act as "change of address cards" for the robots, automatically forwarding visitors to the new location. Without these redirects, search engines have no way of finding the moved pages, and eventually the site will be dropped from the index altogether. Redirects also preserve all of the links that currently point to your site and the stream of traffic flowing through them.
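
How you set up 301 redirects depends on your server, but as a minimal sketch, on an Apache server they can be added to an .htaccess file like the one below. The paths shown are placeholders for your own old and new URLs, not part of any particular site:

    # .htaccess - permanent (301) redirects from old URLs to new ones
    # Forward a single moved page to its new address
    Redirect 301 /old-services.html /services/

    # Forward an entire renamed section with mod_rewrite
    RewriteEngine On
    RewriteRule ^blog/(.*)$ /news/$1 [R=301,L]

The "301" status matters: it tells search engines the move is permanent, so they transfer the old page's standing to the new URL instead of treating it as a temporary detour.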

The Robots.txt

While a site is being developed and approved, the robots.txt file tells search engine crawlers which pages should be indexed and which should be avoided. Development and staging sites are often set to disallow all crawling, and if that "under construction" robots.txt carries over to the live server, search engines may continue excluding your site from their indexes even after launch. To check, open a browser and enter your domain name into the address bar followed by /robots.txt. This displays the robots file, where you can see the "Disallow" rules currently applied to your site.
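
As an illustration, a leftover development robots.txt often looks like the first example below, which blocks every crawler from the entire site. A healthy production file usually looks more like the second; the /admin/ path there is just a placeholder for whatever private area you actually want hidden:

    # Development/staging robots.txt - blocks all crawlers from everything
    User-agent: *
    Disallow: /

    # Typical production robots.txt - allows crawling, hiding only private areas
    User-agent: *
    Disallow: /admin/

If your live site still shows "Disallow: /", replacing it with the production version is the first fix to make.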

Page Blocking

Robots can be blocked not only through the robots.txt file; individual pages can also block robots on their own. While you may want to keep certain pages out of search engine indexes, you will want the majority of your pages available. View each page's source code and check for a robots meta tag in the head of the page, such as the one shown below.
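
A typical blocking tag looks like this; if it appears on a page you want indexed, remove it or change the values:

    <!-- Tells search engines not to index this page or follow its links -->
    <meta name="robots" content="noindex, nofollow">

These tags are sometimes added site-wide during development for the same reason as a restrictive robots.txt, so check a few representative pages after launch to make sure they didn't carry over.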

Any traffic loss can be a hardship for your newly updated site, but don't panic if you see decreases shortly after relaunching; work through these issues methodically and resolve them quickly. Then keep improving your online marketing by checking for a few other common mistakes that can happen during a major overhaul.
