Most Common Website Configuration Errors

Discover the most common errors that web developers make when setting up their websites and learn how you can fix them.

We have a lot of experience with testing websites to make sure they follow best practices. Our website validation and testing tools have been used to test thousands of websites from small to large and we have noticed that some errors are more common than others. Even talented web developers can sometimes forget to optimize certain things, and that's where we come in to help.

Website Configuration Errors

This article will list the top website configuration errors that we have compiled from our anonymized records. Some mistakes are more common than others, but most of these mistakes have easy fixes that can often be implemented in a few minutes. Read on to learn about the common errors, click the associated links to learn more, and when you are ready to test your website you can use the tool at the bottom of the page to get your personalized report.

Basic Header Tags

Sometimes it's easy to overlook the simple things. Even seasoned web developers can sometimes miss the basics, and our findings show that simple but crucial HTML header tags are often overlooked. These tags are important for informing search engines and browsers about the content and structure of your site. Here are some of the most common errors that we noticed:

  • Page Title

    Every page needs a title - it's fundamental. Yet, 17% of the websites that have been tested by ValidBot were missing this most basic of HTML elements. The page title will appear in search results and in the browser tab when viewing your website, so every page needs one.

  • Page Description

    Nearly 40% of websites were missing the "description" meta tag. This tag can appear in search results and in preview boxes when your site is shared on social media. It's easy to add a page description and it can help with search engine optimization, so every web developer should add a description tag.

  • Viewport Meta Tag

A viewport meta tag was missing for 29% of the tested websites. This simple tag tells mobile web browsers how to correctly size the page. If you have ever visited a website on your phone where the text was too tiny to read without zooming in, that was likely caused by a missing viewport meta tag.

  • Canonical Meta Tag

    A canonical meta tag improves SEO by telling search engines which URL is the authoritative source of content. Without this tag, your site may get penalized for having duplicated content. This is because "example.com/page.html" and "www.example.com/page.html" look like two separate pages to search engines. Pick one and put it in the canonical meta tag. 54% of websites that we tested had no canonical meta tag.

  • Charset

    34% of sites do not set the character set for text displayed on the page. This can result in some special characters (emoji, accented characters, mathematical symbols, etc) being displayed incorrectly or garbled. The most common character set is UTF-8, so make sure you set this in the HTML header.

These mistakes may be common, but they're also some of the simplest to correct. So, if your website is missing one of these, take a few minutes to fix it up.
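To show how quick these fixes are, here is a minimal HTML head that covers all five tags (the title, description and URL below are placeholders for your own content):

```html
<!DOCTYPE html>
<html lang="en">
<head>
  <!-- Charset should come first so the parser decodes the rest correctly -->
  <meta charset="UTF-8">
  <!-- Tells mobile browsers to size the page to the device width -->
  <meta name="viewport" content="width=device-width, initial-scale=1">
  <!-- Shown in browser tabs and search results -->
  <title>Example Widgets - Handmade Widgets Since 1999</title>
  <!-- May appear in search results and social media previews -->
  <meta name="description" content="Shop our catalog of handmade widgets.">
  <!-- The one authoritative URL for this page -->
  <link rel="canonical" href="https://www.example.com/">
</head>
```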

Website Redirects

Appropriate Redirects

Ensuring that your website is secure and quick to load is essential for a positive user experience. Upon connection, your server should begin delivering content without delay and via a secure, accurately formatted URL. Implementing this correctly is an area where we commonly see mistakes. Here are the most common problems that we see:

  • One Canonical Homepage

    There are several different URLs that can get a user to the same homepage of a website. For example:

    • http://example.com
    • https://example.com
    • http://www.example.com
    • https://www.example.com
    • https://www.example.com/
    • https://www.example.com/index.html

    Websites should pick one of these URLs and redirect everything else to the canonical version. 39% of the websites that we tested did not do this. Fixing this will give users a consistent experience and will enhance your search engine ranking by consolidating your rank into one URL.

  • HTTPS Upgrading

Alarmingly, we found that 19% of websites did not enforce an encrypted SSL/TLS connection to protect their users' privacy. When a user arrives via "http://", the server should automatically redirect them to "https://" before sending any content over the wire.

  • Redirect Chain

47% of the websites that we tested had a redirect chain that adds unnecessary delay to the page load. A redirect chain occurs when a website attempts to have one URL for the homepage (see above) but uses two chained redirects to get there. Typically we see this when a user types "http://example.com" and the server first redirects to "https://example.com" and then issues a second redirect to "https://www.example.com". This causes extra delay in loading the page. Websites should be configured to perform at most one redirect.

Consistently redirecting users to the correct, secure URL without detours is not just good practice, it's a fundamental aspect of a trustworthy, efficient online presence. Website owners should fix any of these problems to avoid confusion for users and search engines alike.
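As one way to implement this, here is a sketch of an nginx configuration, assuming "https://www.example.com" is your chosen canonical URL, that sends every non-canonical variation to the canonical one in a single 301 redirect:

```nginx
# All plain-http traffic, with or without www, goes straight to the
# canonical https URL in one hop.
server {
    listen 80;
    server_name example.com www.example.com;
    return 301 https://www.example.com$request_uri;
}

# https without www also redirects in one hop.
server {
    listen 443 ssl;
    server_name example.com;
    # TLS certificate directives omitted for brevity
    return 301 https://www.example.com$request_uri;
}

# Only the canonical host actually serves content.
server {
    listen 443 ssl;
    server_name www.example.com;
    # TLS certificate directives omitted for brevity
    # ... serve the site here ...
}
```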

Email Authentication Mistakes

Email Authentication

It is very common for a website to have some sort of mistake or omission in how its servers are set up to securely send email. If things are not set up correctly, spammers and scammers can send an email to someone as if it were coming from your website. This can trick users into giving up sensitive and private information, and it can damage your website's reputation. Misconfigured servers can also result in your legitimate email being incorrectly marked as spam - you don't want that! Our detailed article on email best practices goes through this in depth, but here we'll just lay out the results of our tests.

  • SPF Record

An SPF record is the first line of defense. It is a simple DNS record that lists which servers are allowed to send email for your domain. If a user receives an email claiming to be from your website but sent from a source not allowed by your SPF record, it is likely to be marked as spam. It's easy to add an SPF record, yet 31% of websites do not have one, or have one that is formatted incorrectly.

  • DKIM Records

    A "DomainKeys Identified Mail" record is a way to cryptographically sign an email with your private key which can later be authenticated by email recipients to trust that the email really did come from you and not someone else pretending to be you. Nearly 40% of websites that we tested did not have a DKIM record.

  • DMARC Record

A DMARC record is the most recent addition to email security. Surprisingly, 52% of websites do not have a DMARC record at all, and 41% of those that do are not using the strictest receiver policy ("p=reject"), which means that users may still receive spoofed email sent on behalf of your website.

  • MX Record

Sometimes a website may not want to receive email, and so it may not configure any MX records. In fact, 55% of websites that we tested were missing an MX record. For spam prevention, some receiving email servers will reject messages when the sending domain does not have an MX record, so you should set one up even if you don't plan to use it; otherwise, emails that you send may not make it into your customers' inboxes.

SPF, DKIM and DMARC records may sound intimidating, but they are straightforward and easy to implement. To get you started, we have created an Email Wizard that can make the records for you. All you need to do is copy and paste them into your DNS provider. Even if your website does not send email, you should still have these records to make sure that nobody else can falsely send email for you.
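For reference, a complete set of records looks roughly like the following DNS entries. The mail server hostname, DKIM selector ("s1"), public key and include domain below are illustrative placeholders; your email provider will supply the real values:

```text
; SPF: only the listed servers may send mail for example.com
example.com.               IN TXT "v=spf1 include:_spf.mailprovider.example -all"

; DKIM: public key used to verify signed messages (selector "s1" is a placeholder)
s1._domainkey.example.com. IN TXT "v=DKIM1; k=rsa; p=MIGfMA0GCSq..."

; DMARC: reject mail that fails SPF/DKIM checks, and send reports
_dmarc.example.com.        IN TXT "v=DMARC1; p=reject; rua=mailto:dmarc@example.com"

; MX: where inbound mail should be delivered
example.com.               IN MX 10 mail.example.com.
```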

Bots crawling websites

Bot Files

robots.txt, sitemaps and manifests

Users come to your website to shop, be entertained or find useful information, but they aren't the only visitors. Bots also visit your website to gather information for search engines and other purposes. If you want your website to be visible to search engines and want to optimize how bots crawl your content, you should set up a few files used behind the scenes by bots.

  • Robots.txt

A robots.txt file tells visiting crawlers how they should access your website. You can tell bots to ignore certain content, you can set rate limits, and you can even block certain bots. Even a mostly blank robots.txt file has value, yet 26% of websites did not have one, and of those that did, 37% contained some sort of error that could prevent bots from reading the file.

  • Sitemap

A sitemap is an XML file that lists all of the URLs for your website. It can help search engines crawl your website and improves the chances of your pages getting listed in search results. It helps websites attract customers, yet, surprisingly, 40% of websites don't have one. If your website has content that you want indexed by search engines, consider adding a sitemap.

  • Manifest File

81% of websites that we tested did not have a manifest file. A manifest file is a JSON file that tells bots and web browsers the preferred name of your website, its color scheme and the location of an icon that can be displayed for your site. It can also declare other useful things. There is no downside to having a manifest file, and only upside, so every website should have one.

Make sure that your website has a properly configured Robots.txt file, Sitemap and Manifest so that your presence in search engines, social media, and in the user's browser can be optimized. It's good practice and good business sense.
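As a starting point, here is a sketch of a permissive robots.txt that also points crawlers at your sitemap, and a minimal manifest (the name, colors and icon paths are placeholders):

```text
# robots.txt - allow all crawlers and advertise the sitemap location
User-agent: *
Disallow:

Sitemap: https://www.example.com/sitemap.xml
```

```json
{
  "name": "Example Widgets",
  "short_name": "Widgets",
  "theme_color": "#336699",
  "background_color": "#ffffff",
  "icons": [
    { "src": "/icon-192.png", "sizes": "192x192", "type": "image/png" },
    { "src": "/icon-512.png", "sizes": "512x512", "type": "image/png" }
  ]
}
```

The manifest is typically saved as "/manifest.json" and referenced from your HTML head with a tag like `<link rel="manifest" href="/manifest.json">`.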

Security Headers

Securing your users' data and privacy isn't just a courtesy, it's a fundamental aspect of web development. The first step to doing this correctly is to set certain HTTP headers in your server's responses to web browsers. Properly implemented security headers are a first line of defense against various online threats.

  • HTTP Strict-Transport-Security

An alarming 58% of websites that we tested did not use Strict-Transport-Security. HSTS is a simple header that tells web browsers to always use encryption (https instead of http) when accessing your website. All websites should use it to make sure customer data is encrypted.

  • X-Frame-Options

Over half of websites did not set an X-Frame-Options header. This simple header prevents your website from being embedded inside an iframe on other webpages, which protects your users from "click-jacking" attacks, where attackers trick users into clicking something harmful.

  • X-Content-Type-Options

46% of sites did not have an X-Content-Type-Options header set to "nosniff". Setting this can prevent maliciously crafted code from being executed by the website via a MIME type confusion attack.

  • Content-Security-Policy

    A Content-Security-Policy was missing for 85% of websites that we tested. A CSP header helps protect website users from certain types of attacks (Cross Site Scripting, Data injection, etc) by telling the browser which sources of content should be trusted. If you need help crafting this header for your website, try our CSP Generator which makes it super easy.

Understanding and implementing security headers can seem daunting, but it's a critical step toward fortifying your site. You can learn about more HTTP headers that protect your users by reading our in-depth article on the subject. Just copy and paste a few lines into your server configuration to increase your website's security.
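Put together, the response headers might look like this. Note that the Content-Security-Policy shown is a deliberately strict starting point ("default-src 'self'" allows only same-origin content) and will almost certainly need tailoring to your site's scripts, styles and images:

```text
Strict-Transport-Security: max-age=31536000; includeSubDomains
X-Frame-Options: DENY
X-Content-Type-Options: nosniff
Content-Security-Policy: default-src 'self'
```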

User Privacy

Permissions / Feature Policy

A Permissions Policy (previously named Feature Policy) is a mechanism by which web developers can disable certain browser features that their website does not intend to use. This can enforce best practices and improve security and privacy for users, especially if the website uses 3rd party scripted content.

For example, if a website does not intend to use the computer's webcam or microphone, it can declare this with a Permissions Policy and the browser will disable those hardware features. If the website is running 3rd party software, such as advertising, this also prevents any of those ads from using the camera and microphone. This allows a website to enforce policies on untrusted content.

To be fair, the feature is still somewhat new and has already been rewritten and renamed, which may be why it hasn't been more widely adopted. The older "Feature Policy" syntax (created in 2018) is widely supported by web browsers, but the newer "Permissions Policy" syntax (created in 2021) isn't fully supported yet.

We recommend that web developers add both of these headers to their website, even if only to disable the camera and microphone (assuming you don't need those hardware features on your website).
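To cover both old and new browsers, you would send both headers. Here is what disabling just the camera and microphone looks like in each syntax:

```text
Permissions-Policy: camera=(), microphone=()
Feature-Policy: camera 'none'; microphone 'none'
```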

Website Speed

Websites that load quickly provide a better user experience. They reduce bounce rates and increase engagement, as users are less likely to abandon a site that loads swiftly. Good scores in Core Web Vitals such as Cumulative Layout Shift (CLS) and Largest Contentful Paint (LCP), along with related metrics like First Contentful Paint (FCP), are critical as they directly impact the perceived responsiveness of a page. For web developers, optimizing these metrics is essential for user satisfaction and SEO purposes. Search engines prioritize sites that load quickly, which can lead to higher rankings and increased organic traffic.

  • Page Speed

    For the websites that we tested, 22% of them received an overall score of "Slow" and 46% received a score of "Medium". It can be challenging to get into the "Fast" category, but every web developer should strive for this. Try optimizing your images, fonts, javascript and stylesheets. You can also lazy load anything that isn't necessary for the first display of the page.

  • Cumulative Layout Shift (CLS)

    CLS measures the visual stability of the webpage by tracking how much elements shift around as the page loads. 26% of the websites that we tested received a poor score on this test, meaning that users will see elements of the page bounce around, which makes it difficult to read text and provides for a poor experience.

  • First Contentful Paint (FCP)

    Nearly two thirds of the websites that we tested had a poor FCP score. This measures the amount of time that it takes from when the page first starts to load to when it is able to display anything other than a blank screen. Web developers should aim to get this time under 1 second. Anything over 3 seconds is considered to be poor. Web developers will need to optimize how page resources are loaded to improve this score.

  • Largest Contentful Paint (LCP)

    35% of websites have a poor LCP score. This metric measures the time it takes for the largest chunk of content to display (usually the first image on the page). This is likely when the user perceives that the page has loaded, even if additional resources continue to load in the background. Optimizing this will improve how users perceive the website's speed.

Improving Core Web Vitals scores is important for web developers aiming to build sites that provide a good user experience. It can be hard work, but prioritizing these performance metrics can ensure that a website delivers the seamless and engaging experience that users expect, while also improving their SEO standing.
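A couple of small markup changes illustrate the kinds of optimizations involved (the file names below are placeholders):

```html
<!-- Preload the hero image so the largest element paints sooner (helps LCP) -->
<link rel="preload" as="image" href="/hero.jpg">

<!-- Explicit width/height reserves space before the image arrives, so the
     page doesn't jump around while loading (helps CLS) -->
<img src="/hero.jpg" width="1200" height="600" alt="Featured product">

<!-- Lazy-load below-the-fold images so they don't compete for bandwidth
     with the content users see first (helps FCP and LCP) -->
<img src="/gallery-1.jpg" width="400" height="300" loading="lazy" alt="Gallery photo">
```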

Favicons and Icons

Icons

Every website should have an icon to help visitors easily recognize and remember your website. It can be a logo, or some other image that represents the website. These icons (sometimes called favicons or shortcut icons) are used by browsers, operating systems, social media and search engines to identify the website and add some visual flair. Having appropriate icons is an essential aspect of a website's identity and branding and helps foster a stronger connection with your audience.

  • Favicon File

    Surprisingly, 57% of websites did not have a simple "favicon.ico" file. This file was designed to enhance bookmarks in the web browser but is also used for the URL bar, browser tab and in other places. Websites without a favicon may appear with a blank icon.

  • Favicon Sizes

The favicon (.ico) file format is actually a container that can hold several icons at different sizes. The recommendation is to include both 16x16px and 32x32px sizes. More than 60% of websites are missing one of these sizes.

  • Larger Icons

Larger icons come into play on various platforms, from Android homescreens to desktop shortcuts. Unfortunately, 88% of websites do not include links to these larger sizes. The 192x192px and 512x512px icon sizes are highly recommended, as they allow user devices and other services to represent your website with high quality imagery. For example, if a user adds your website to their Android homescreen, it can use the larger images as a nice icon. Without this, the website may be represented by a white square or a pixelated screenshot of the homepage.

  • Apple Icon

    More than 90% of websites are missing an Apple Icon. These are used by iOS devices to represent your website when someone adds it to their homescreen as a shortcut. It is such a common file that other companies aside from Apple use it as well.

Curious about crafting the perfect set of icons for your website? Dive into our comprehensive article on Favicons where we explain all the different uses. We'll walk you through the different sizes and formats, ensuring your website shines in every context. It's time to give your website the iconic identity it deserves.
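A full set of icon links in your HTML head might look like this (the file paths are placeholders, and 180x180px is the commonly used size for the Apple icon):

```html
<!-- Classic favicon.ico at the site root; an explicit link is also helpful -->
<link rel="icon" href="/favicon.ico" sizes="16x16 32x32">
<!-- Larger PNG icons for Android homescreens, desktop shortcuts, etc. -->
<link rel="icon" type="image/png" sizes="192x192" href="/icon-192.png">
<link rel="icon" type="image/png" sizes="512x512" href="/icon-512.png">
<!-- Apple touch icon for iOS homescreen shortcuts -->
<link rel="apple-touch-icon" href="/apple-touch-icon.png">
```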

ValidBot can test your website

Test Your Website

Want to see how your website fares against the tests mentioned above? Type your domain into this box and ValidBot will run it through our gauntlet of tests and give your website a score. Then you can learn how to improve your website so that it is more efficient, secure and useful to users.