WordPress’s REST API was built for just this type of use. Magento has enthusiastically embraced Progressive Web Apps for eCommerce front-ends. Progressive Web App frameworks are available for WooCommerce.
But as browser support has improved, developers, especially library developers, are releasing production code that uses a lot of ES6 features. If web developers aren’t careful, they can introduce code that Google’s crawlers don’t understand.
Three Solutions To Google’s Language Barrier
There are three basic solutions to Google’s non-comprehension of ES6:
Don’t release ES6 code to production: transpile it to ES5 or avoid it altogether.
Stick to traditional server-side rendering with Node, PHP, or other server-side languages.
Render the initial load of a client-side application on the server. This is a popular choice and frameworks like Next.js make it fairly straightforward. The initial view, content included, is rendered on the server and sent to the browser. Thereafter content is loaded by the client-side application in the traditional way.
Google has recently introduced a new option: dynamic rendering. With dynamic rendering, content sent to the web crawler is rendered on the server, but content sent to browsers is rendered on the client. If that sounds like a breach of Google’s rules against sending different content to browsers and crawlers, that’s because it is. But Google is making an exception because it recognizes that its crawlers are holding developers back.
It is likely Googlebot will lag behind browsers for the foreseeable future, so if you want to embrace Progressive Web Applications for your business or eCommerce store, it’s worth taking the time to understand exactly what the crawlers understand.
Page speed is linked to almost every aspect of your website. From SEO to conversions, to user experience and beyond. A faster site usually means better results across the board. Yet improving site speed can be a complex process. From the hosting provider you choose to the platform you decide to work with, there are a lot of areas that can be tweaked to improve site speed. This article is about website optimizations you can implement yourself.
But first, how do we know site speed is important?
On top of this, it was becoming increasingly clear that site speed also had a large effect on conversions. For eCommerce businesses, the statistic that a 1 second delay in page response can lead to a 7% reduction in conversions is often cited. Many users expect pages to load in 2 seconds or less, bouncing if load times take too long.
This article will look at some simple website optimization tools, tips, techniques, and more. Starting with how to test your site for speed, and then providing actionable and easy to implement website optimizations.
The first thing you’re going to want to do is to test your website’s current speed. This will help to give you a snapshot for before and after benchmarks during the optimization process.
A couple of website optimization tools you can start with are Webpagetest.org and Google’s Page Speed Insights. Make sure to run multiple tests and then average the results. Web Page Test will allow you to do this automatically in the settings menu. We also recommend selecting a testing location near to where your site is hosted to get the most accurate results.
The example above shows how we usually set up webpagetest.org for basic speed checks.
Page Speed Significance
Below you can see the results of a page speed test on the Magento demo site averaged out. We’ve picked out 3 of the most important and significant stats we want to use for optimizing the site.
Time To First Byte
The first stat we’ve taken is load time. This shows us the complete time it takes to load our page. Remember, if a page takes over 3 seconds to load, you may be losing half of your potential traffic. This number is the most important for us to change.
The second stat is Time To First Byte (TTFB). We’ve talked about this previously and discussed how this is a generally overused statistic. While it can help to provide guidance, TTFB can be manipulated relatively easily and isn’t as important as some may think.
The third stat is the size of the data being downloaded to the page. If this number is very large, it may be useful to take a closer look at which page elements take the longest to load. You can do this by taking a peek at the waterfall.
Page Speed Waterfall
An example of what the waterfall looks like can be seen below. Here we’re able to isolate what elements are slowing down site speed the most.
Once you’ve looked at the waterfall, you should have a better idea of what can be improved. Above, we can see that some of the front end .js files can likely be sped up slightly. Lower down the waterfall (off the page), there are also some image files that take longer than many of the page’s other image elements. These may also be an area we can look into and improve on.
Image compression is one of the easiest ways to optimize site speed. Too often, web designers use images at uselessly high resolutions. High-resolution images take up more storage space on a server and can increase load times significantly.
We highly recommend scaling images appropriately. If an image is only going to take up a 100 x 100 pixel space on your site, there’s no need to make it 1000 x 1000.
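Some back-of-the-envelope arithmetic shows why scaling matters. The sketch below (with purely illustrative dimensions) estimates raw, uncompressed RGB image sizes:

```python
# Rough uncompressed size of an RGB image: 3 bytes per pixel.
def raw_size_bytes(width, height, bytes_per_pixel=3):
    return width * height * bytes_per_pixel

oversized = raw_size_bytes(1000, 1000)  # 3,000,000 bytes (~2.9 MB)
scaled = raw_size_bytes(100, 100)       # 30,000 bytes (~29 KB)

# Serving the image at the size it is displayed moves 100x less data.
print(oversized // scaled)  # 100
```

Compression formats shrink these numbers considerably, but the proportions hold: an image ten times wider and taller carries a hundred times the pixel data.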
Image Type Extensions
When saving images, deciding on what extension to use generally falls into one of two different options: .jpeg and .png.
There’s a lot of false information out there about which extension should be used. Many say that .png (Portable Network Graphics) is always the better option, since the format compresses images losslessly, without sacrificing quality. In practice, .jpeg files are often much smaller for photographs, while .png tends to win for graphics with flat colors and sharp edges.
Before settling on a file type, it can be beneficial to check how saving as each type affects file size and quality. If you notice a clear difference, opt for the better extension.
There are several ways to improve image optimization. You can use a third-party piece of software and do this yourself, or some CMS’ offer internal functionality, plugins or extensions to manage this for you. The EWWW Image Optimizer WordPress plugin is a great tool for getting started if you’re running a WordPress site.
Google recommends testing Gzip compression before implementing it in a production environment. We recommend making use of a dev site environment that mimics your production environment if you’re planning on doing this.
Also note that there has been some evidence that compression can increase Time To First Byte durations – despite reducing overall page speed load times. Some SEO authorities have suggested that Google may, in fact, prioritize TTFB over overall page load times. Because of this, we recommend testing on single pages before making the switch sitewide.
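To get a feel for how much Gzip can shave off a typical HTML payload before touching your server configuration, you can experiment locally with Python’s built-in gzip module. The sample markup below is made up for illustration:

```python
import gzip

# Simulated HTML payload: markup is highly repetitive, which is
# exactly the kind of content Gzip compresses well.
html = (
    "<html><head><title>Demo page</title></head><body>"
    + '<div class="product"><p>Repeated boilerplate markup.</p></div>' * 200
    + "</body></html>"
).encode("utf-8")

compressed = gzip.compress(html)
print(f"original: {len(html)} bytes, gzipped: {len(compressed)} bytes")
print(f"saved: {100 * (1 - len(compressed) / len(html)):.0f}%")
```

Real pages won’t compress quite this dramatically, but savings of 60 to 80 percent on HTML, CSS, and JavaScript are common.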
When it comes to page load speed, less is almost always more. Instead of adding additional functionality to core pages, how about settling on something simpler and faster?
The fewer HTTP requests a site has, the faster it will load (usually).
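One way to audit this is to count the tags in a page’s markup that typically trigger additional requests. Here’s a rough sketch using Python’s standard-library HTML parser; note that real pages also trigger requests for fonts, CSS background images, and XHR calls, which this won’t catch:

```python
from html.parser import HTMLParser

class RequestCounter(HTMLParser):
    """Counts tags that typically trigger extra HTTP requests."""
    def __init__(self):
        super().__init__()
        self.count = 0

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag in ("img", "script", "iframe") and attrs.get("src"):
            self.count += 1
        elif tag == "link" and attrs.get("href"):
            self.count += 1

page = """
<html><head>
<link rel="stylesheet" href="/style.css">
<script src="/app.js"></script>
</head><body>
<img src="/logo.png"><img src="/hero.jpg">
</body></html>
"""
counter = RequestCounter()
counter.feed(page)
print(counter.count)  # 4
```

Each counted tag is a separate round trip to the server (unless cached), so trimming this number is a direct speed win.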
In addition to improving site speed, simple web design has also been shown to improve user experience in many cases. In a UX study conducted by Google, it was found that users tend to judge a website’s aesthetics within 1/50th – 1/20th of a second, and that visually complex sites were almost always judged as being less beautiful than their simpler counterparts.
The more beautiful a website is perceived to be, the better its UX and SEO will be, and the more conversions will increase.
A simpler website design is one of the quickest methods for improving page speed within a short period of time. However, we recommend running A/B tests in order to see how changes actually perform and not making a 100% change straight away.
Caching lets repeat visitors load your site much faster: page elements are stored on their machine in a cache, a form of temporary storage, so they don’t need to be downloaded again.
For WordPress and WooCommerce sites, we make use of W3 Total Cache to manage caching functions. This will come pre-installed and pre-configured when you purchase a WordPress optimized hosting solution.
In order to optimize site speed even more, the Hostdedi Cloud allows for use of the Cloud Accelerator. This can easily be turned on and off with the click of a button under the performance section of the Client Portal.
Caching with a CDN (Content Delivery Network) in place is a more complicated process and can require advanced setup. However, a proper caching configuration with a CDN can help you to reach that global audience as though you were with a local host.
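Whichever layer does the caching, browsers and CDNs are ultimately driven by HTTP headers such as Cache-Control. As a small illustrative helper (not part of any particular plugin), here is how the max-age directive, which says how long a cached copy may be reused, can be read:

```python
def max_age_seconds(cache_control: str):
    """Extract the max-age value from a Cache-Control header, or None."""
    for directive in cache_control.split(","):
        directive = directive.strip()
        if directive.startswith("max-age="):
            try:
                return int(directive.split("=", 1)[1])
            except ValueError:
                return None
    return None

print(max_age_seconds("public, max-age=86400"))  # 86400 (cache for one day)
print(max_age_seconds("no-store"))               # None (never cache)
```

Checking the Cache-Control header your pages actually send is a quick way to confirm a caching plugin or CDN is doing its job.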
You need to ask yourself: What page are you optimizing? Homepages are an important part of a website, but what other pages are users going to be interacting with a lot? All of these pages need to be speed checked and optimized.
For instance, it’s not good enough for us to just optimize Hostdedi.net, we also need to optimize Hostdedi.net/magento/hosting and Hostdedi.net/cloud/hosting.
Before you set out to begin this process, try to put together a plan for what key pages are drawing in the most conversions and attracting the most ROI.
Page Speed and SEO
We said earlier that page speed and SEO are heavily linked. However, we also mentioned that page speed is – by far – not the most important factor in determining rank. Google has said that if a page or site’s content is more relevant and people are willing to wait for it to load, then they will not penalize that site.
Page speed is an important part of optimizing, but content, quality, and user experience should always come first.
One of the things we admire most about WooCommerce is its rich out-of-the-box functionality. A new eCommerce retailer can start selling in next to no time. They can focus on adding products and configuring their store without needing to install an array of extensions to add essential features.
But including every possible feature would result in a messy and bloated application, which is why WooCommerce also provides a way to add extensions that bring new tools, integrations, and features.
Even during the setup process in WooCommerce, you are given the option of installing additional extensions to add functionality to your WooCommerce store. We highly recommend that new eCommerce merchants browse the available WooCommerce extensions to get a feel for what’s possible.
In this article, we’re going to highlight six popular plugins that WooCommerce professionals shouldn’t be without.
What is the difference between a WooCommerce extension and a WooCommerce plugin?
In reality, nothing. Both terms are used interchangeably to refer to something that adds extra functionality to a WooCommerce store or a WordPress site. “Plugin” likely caught on because of its use for adding WordPress functionality, while “extension” is the term WooCommerce uses for plugins that only affect WooCommerce.
Product Filter for WooCommerce gives customers extra options for filtering and sorting products. Products can be filtered by price, category, color, size, availability, and many other factors. The filtering is responsive and intuitive, and is fully customizable by the WooCommerce store owner.
Product Filter is a great addition to a store with more than a handful of SKUs, as it allows store owners to make the user experience as streamlined as possible. Remember, the better the UX on your store, the higher your conversions will be.
Customers have lots of reasons for putting products in the cart, and some have no intention of buying. But a significant proportion of abandoned carts can be “saved” if the retailer contacts the customer to remind them or send a relevant offer.
Abandoned Cart Lite is a simple extension that will email notifications to shoppers to remind them about orders that aren’t completed. Abandoned Cart Pro — the extension’s premium version — includes the ability to add unique coupons to the emails.
Have you ever found yourself feeling as though you don’t have enough options when it comes to customizing your products? Is there a field you don’t see in stock WooCommerce that you think should be there?
WooCommerce Extra Product Options extends the range of product options available to WooCommerce retailers. Additional product options can be added via checkboxes, radio buttons, date pickers, and forms, depending on the needs of the retailer.
Do you want to set custom rules for pricing on your products?
Not all stores follow a one-size-fits-all approach, and trying to configure multiple price points for a single product in stock WooCommerce can be a challenge.
WooCommerce Dynamic Pricing & Discounts is an all-in-one solution for price and discount management. It can be used to create sales, bulk pricing, BOGOF offers, member pricing, loyalty programs, and more.
Manual collection of data can be made so much easier. Instead of spending a significant amount of time taking data out of your WooCommerce store and entering it into a spreadsheet, why not use a WooCommerce extension that adds that functionality for you?
We are big fans of automation. Running an eCommerce store of any size is a lot of work, and much of that work involves moving data from one service to another.
Zapier is great for connecting WooCommerce to the other tools you use to run your business, including marketing tools, spreadsheets, and accounting platforms.
WooCommerce Google Analytics does exactly what the name suggests, allowing merchants to leverage the power of Google Analytics to track a variety of eCommerce-related metrics, including cart actions, product views, and user journeys.
Reaching an international audience usually has one large barrier to entry: language.
Whether it’s trying to reach an audience halfway around the world, or just next door, if they can’t understand your page content, they’re not going to get very far.
WooCommerce Multilingual helps to bridge this gap with automated multilingual functionality. Content created in your first language is translated into the user’s language – detected through their browser – and maintained through the entire purchasing process.
WooCommerce Multilingual also adds the ability to manage multiple currencies in conjunction with multiple languages. A great way to start expanding your eCommerce business quickly.
Something that often goes overlooked by new store owners is invoices and packing slips. This little addition can be the difference between looking like a professional store and something a little more amateur.
For some first-time store owners, invoices and packing slips are another area where time can easily be saved by letting an extension manage the process for you.
WooCommerce PDF Invoices & Packing Slips allows you to generate and attach invoices to emails or prepare for printing with just the click of a button. Added functionality and saved time.
The Best WooCommerce Extensions
Are there any WooCommerce extensions or plugins you think we’ve missed? There are hundreds of WooCommerce extensions for retailers to choose from, so feel free to let us know about your favorite extensions in the comments.
How does a browser load a web page? It uses a phonebook. Not an old-fashioned leatherbound book or a switchboard operator, but a service known as DNS. Each entry in that DNS “phonebook” is what’s known as a DNS record.
In other words, when you look for nexcess.net, your computer looks in the DNS “phonebook”, finds the number for the site, and connects you to it. Of course, the whole process is much faster than this.
This article looks at what DNS records are, the different types you’ll find, and why they’re incredibly important for the success of any website.
It was 1983. The internet was young and IT professionals had begun to get fed up with having to remember long series of numbers in order to connect with other machines. Networks had spread beyond just a few units and in an effort to future-proof, longer series of numbers were proposed. There was just one problem, how to make these numbers more consumer friendly?
Paul Mockapetris published two papers on the subject, creatively named RFC 882 and RFC 883. Mockapetris’ system expanded prior use of a hosts.txt file into a large system capable of managing multiple domains in a single location. That system is known as DNS, or Domain Name System.
Without DNS, the Internet wouldn’t be what it is today. We might even need a Rolodex to visit our favorite sites!
With DNS, computers still require the IP (Internet Protocol) address number sequence in order to connect with a server. Yet with 4,294,967,296 possible IPv4 addresses, it makes a lot more sense to convert those numbers into something more easily recognizable.
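Those address-space figures are easy to verify: IPv4 addresses are 32 bits long and IPv6 addresses are 128 bits, so the totals fall out of simple powers of two.

```python
ipv4_space = 2 ** 32    # total possible IPv4 addresses
ipv6_space = 2 ** 128   # total possible IPv6 addresses

print(ipv4_space)  # 4294967296
# IPv6 offers 2**96 (about 7.9e28) times as many addresses as IPv4.
print(ipv6_space // ipv4_space)
```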
DNS gives IP addresses unique names for computers, services or other resources that are either part of a private network or part of the Internet.
The Hostdedi DNS network has 100% uptime with multiple redundancies in place
The domain name system prevents having to remember a long series of numbers. Users are able to type in a domain name and then the domain name system will automatically match those names with the IP address and route connections.
At the center of all this, the hosts.txt file still existed in the form of vast servers for managing domain names and at the heart of these servers are DNS records.
IP addresses work in a similar fashion to that of street addresses or phone numbers in an address book. While people browse the Internet, they look up their favorite site much like they look up a friend’s number. From there, the system provides them with the friend’s number and they can contact them. With DNS, the second part of this sequence is automated. This requires DNS records from a DNS server.
During the creation of DNS, servers were deployed solely for the purpose of managing DNS and related information. Within each of these servers are DNS records that tie entries to a domain.
Any device connected to a computer network, whether it is a PC, router, printer, or any other device with an IP address, is referred to as a ‘host’. With the sheer number of hosts around the world, engineers needed a way to track devices without resorting to memorizing numbers.
As explained earlier, DNS records came along with DNS as a tool for system admins and users to seek out authoritative information on websites or other services they’re trying to access.
There are two types of DNS Records. These are:
Records stored in Domain Name System servers
Records stored on a user’s machine
Records stored on a Domain Name System server are covered in more detail below, including what types of records exist and how they function.
Records stored on a user’s machine are also known as the DNS cache. This record lists every website the machine has previously looked up, including attempted visits that never completed.
When you watch a crime drama and a culprit’s computer is taken to be analyzed for the sites they have visited, a DNS cache is usually what would be checked for unauthorized activity.
However, a DNS cache is usually temporary and has a limited lifespan before being removed.
DNS Syntax Types Explained
While there are an abundance of record types in existence, below you’ll find nine of the most commonly used DNS records. For more information, don’t forget to check our DNS Records knowledge base, as well as how to configure DNS records for your site.
A – A records are usually referred to as address records, and occasionally host records. They are the most commonly used records, mapping hostnames of network devices to IPv4 addresses. A website address book.
AAAA – Serves the same purpose as A records, except that hostnames are mapped to IPv6 addresses instead of IPv4. As opposed to the 32 bits of an IPv4 address, an IPv6 address contains 128 bits. An example of an IPv6 address is FE80:0000:0000:0000:0202:B3FF:FE1E:8329.
CNAME – Acts as an alias for domains. The CNAME record is tied to the actual domain name. If the address nexcess.net were typed into your browser, it would resolve to the URL www.nexcess.net.
MX – MX records map a domain name to the message transfer agents responsible for receiving its email, with a preference value assigned to each server. Large organizations often use multiple mail servers to process messages en masse. Through SMTP (Simple Mail Transfer Protocol), emails are routed to their intended hosts.
NS – Also known as name server records; designates an authoritative name server for a given host.
PX – The technical description in RFC 2163 defines the PX record as a ‘pointer to X.400/RFC822 mapping information’. Currently, it is not used by any application.
PTR – Referred to as reverse-lookup pointer records. PTR records are used to look up domain names based on IP addresses.
TXT – A type of DNS record that stores text-based information. It’s primarily used to verify ownership of a domain and to hold SPF (Sender Policy Framework) data, which helps prevent the delivery of fake emails that appear to originate from your domain.
SOA – Possibly the most critical of them all, the Start of Authority record holds administrative details about the zone, including when the domain was last updated.
The general purpose of a DNS lookup is to pull information from a DNS server. This is akin to someone looking up a number in a phone book (hence the term ‘lookup’ in conjunction with DNS).
Computers, mobile phones, and servers that are part of a network need to be configured to know how to translate domain names and email addresses into discernible information. A DNS lookup exists solely for this purpose. There are primarily two types of DNS lookups: forward DNS lookups and reverse DNS lookups.
Forward DNS Lookups
Forward DNS allows networked devices to translate an email address or domain name into the address of the device that will handle the communication. Although the process is invisible to users, forward DNS lookup is an integral function of IP networks, in particular the Internet.
Reverse DNS Lookups
Reverse DNS (rDNS/RDNS) pulls domain name info from an IP address. It is also known as Inverse DNS. Reverse DNS lookups are used to filter undesirable data such as spam. Spam can be sent through any domain name that a spammer desires. Spammers can use this technique to fool regular customers into thinking that they’re dealing with legitimate entities. This can include organizations such as Bank of America or Paypal.
Email servers that receive email can validate it by checking the sender’s IP with a reverse DNS request. If the email is legitimate, the reverse-resolved hostname should match the domain of the email address. While this is useful for verifying the integrity of email, it does not come without a cost: an ISP has to set the records up if the legitimate mail servers themselves do not have the appropriate records on hand to respond properly.
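Both lookup directions can be exercised from Python’s standard library. The sketch below uses localhost so it runs without network access; real forward and reverse lookups work the same way with public hostnames and IPs:

```python
import socket

# Forward lookup: hostname -> IPv4 address (an A-record style query).
ip = socket.gethostbyname("localhost")
print(ip)  # 127.0.0.1

# Reverse lookup: IP address -> hostname (a PTR-style query).
hostname, aliases, addresses = socket.gethostbyaddr("127.0.0.1")
print(hostname)
```

Note that these calls go through the operating system’s resolver, which consults the local hosts file and DNS cache before ever asking a remote nameserver.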
What Are Your DNS Records?
You can check your own DNS records with the Hostdedi DNS Checker. Simply enter the site address you want to check and the type of record you want to see.
You can also use this tool to check third-party DNS records and confirm the identity of certain domains to make sure they are not fake.
Ultimately, DNS makes life easier for the end user that can’t memorize 32-bit or 128-bit IP addresses. It’s easier to just type a name into the browser bar and let DNS figure out the rest. DNS resource records are fundamental for DNS to be able to work, and the Internet wouldn’t be what it is today without them.
If you’re looking for more information on site performance and benchmarking, don’t forget to check our article on TTFB (Time To First Byte) and why it may not be as important as you’ve been led to believe. Also, check out our summary of data center tiers and use the stats to figure out which data center tier you’re hosting with.
Time To First Byte (TTFB) is the time it takes for a web server to respond to a request. It’s a metric reported by several page speed testers, and is often quoted as a primary means for measuring how fast a site is. The idea being that the faster a web server responds, the quicker a site will load.
However, numerous groups have found that TTFB isn’t that important. When looked at in isolation, the figure provides an appealing way to grade your site or hosting provider, but when looked at in conjunction with other metrics, there seems to be a disconnect. This is especially true with regards to SEO rankings and improved user experience.
Here, we’re going to look at why TTFB can be easily manipulated, what metrics actually matter, and how knowing these things can help you to improve your site’s SEO, user experience, and more.
TTFB measures the time between a user making an HTTP request and the first byte of the page being received by the user’s browser.
The basic model of how TTFB works
The model is simple. The faster a web server responds to a user request, the faster the site will load. Unfortunately, things get a little more complicated.
In some cases of testing site speed, you’ll find TTFB test durations far longer than what you would expect. This is despite actual page load times seeming much faster. This is the first indication that something is wrong with how TTFB measures speed.
A deeper look shows that this is because TTFB actually measures the time it takes for the first HTTP response to be received, not the time it takes for the page itself to be sent.
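You can see this distinction directly by timing the two events separately. The sketch below starts a throwaway local HTTP server (so it runs without external network access), then measures time-to-headers, a rough stand-in for TTFB, against total download time:

```python
import http.client
import http.server
import threading
import time

# Serve a trivial page locally so the measurement needs no network.
server = http.server.HTTPServer(
    ("127.0.0.1", 0), http.server.SimpleHTTPRequestHandler
)
threading.Thread(target=server.serve_forever, daemon=True).start()
port = server.server_address[1]

conn = http.client.HTTPConnection("127.0.0.1", port)
start = time.perf_counter()
conn.request("GET", "/")
resp = conn.getresponse()          # headers received: roughly the TTFB
ttfb = time.perf_counter() - start
body = resp.read()                 # full body downloaded
total = time.perf_counter() - start
conn.close()
server.shutdown()

print(f"TTFB: {ttfb:.4f}s, total: {total:.4f}s")
# TTFB is always <= total: a fast first byte says nothing
# about how long the full page takes to arrive.
```

On a real site, running the same two measurements against your own pages shows how much of the load time TTFB actually accounts for, which is often surprisingly little.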
A test of Time To First Byte and page load times
In the Time To First Byte test above, TTFB is measured at 0.417 seconds, which seems very quick. However, looking at the waterfall, we can see that this figure only correlates with the HTML loading time. Afterward, page load speed takes much longer for other assets on the page and we’re seeing DOM content loaded at around 1.6 seconds.
This is because the TTFB value is incredibly easy to manipulate. HTTP response headers can be generated and sent incredibly quickly, but they have absolutely no bearing on how fast a user will be able to see or interact with a page. For all practical purposes, they are invisible.
By sending HTTP response headers early to improve TTFB, it’s easy to create a ‘false’ view of a site’s speed. A fast first byte also doesn’t necessarily mean that the rest of the waterfall will load quickly.
A good example of how Time To First Byte testing can be manipulated with HTTP headers is when looking at the page load times of NGINX in conjunction with compression.
Compressed pages are smaller, so they download from a server faster than uncompressed pages. This ultimately means that page load times to interactivity are much faster. From the perspective of TTFB, however, this is not true.
Time To First Byte compared with actual page loading times
This is because HTTP headers can be generated and sent relatively quickly before the main page content.
This is an especially significant figure for those who make use of the Hostdedi Cloud Accelerator, which uses NGINX to speed up caching on optimized Hostdedi platforms.
Continue reading to find out what metrics you should be using to check page load times.
In a 2013 study by Moz, it was found that Time To First Byte does have a significant correlation with SEO rankings. The faster TTFB was, the higher ranked pages would be.
This being said (and as Moz themselves make clear) correlation and causation are not the same thing. The actual methods Google (and other search engines) use to crawl web pages and build out SERPs are not known to the public.
It’s been deemed by many that page load times to interactivity are actually a lot more important. When looking at page speed tests, it’s important to look at all the figures available as a whole and not just TTFB.
So, with regards to TTFB tests, SEO, and user experience:
Google Does Not Measure Page Speed for SEO (Entirely)
Ok, it sounds like we’ve gone back on what we just said, but bear with us.
Google doesn’t treat page speed as all-important; it measures user behavior. Google has said in the past that if users are willing to wait for content to load, it will not downgrade a website for being slow.
This is because Google weighs usability and experience as more important than speed. Back in 2010, Matt Cutts said that including site speed as a ranking factor “affects outliers […] If you’re the best resource, you’ll probably still come up.” It just happens to be that the less time a user has to wait for a page, the more likely they are to stay on the page.
So when it comes to using speed testing services such as PageSpeed Insights, make sure to consider your page load times from a practical perspective as well. How do you feel about the time it takes for your page to load when you type it in your browser? Do you think the content quality is worth the wait?
PageSpeed Insights provides actionable speed intel for SEO such as that above
Simple checks like this are easy and can provide you with a lot of insight into what your users will think.
Practical Page Load Times Matter – Not TTFB
A faster Time To First Byte does not mean a faster website.
TTFB is not a practical measurement. It doesn’t really affect the user experience. The time it takes for a browser to communicate back and forth with a server doesn’t affect a user’s experience of that server’s content as much as the time it takes for them to actually interact with it.
Instead, measurements that test time to interactivity are inherently more important. Improvements here don’t always match the results of web page speed tests or scores.
So, the main takeaway here? High-quality content and a great user experience are still two of the most significant factors in SEO. Site speed can influence both, but it is far from the most important factor.
However, again, TTFB and page load times aren’t as important as high-quality content and usability. The user experience on mobile devices has long been a key area Google and other search engines have tried targeting and improving. Load times are just a small part of this.
Responsive design and easily readable and scalable text and images are much more important.
Google highly recommends its PageSpeed Insights tool as the way to properly see how your page speed may affect SEO ranking.
Slow and Steady Wins the Race
Ok, all this doesn’t mean that you should let your site crawl to a halt. This isn’t a childhood fable or a call to reduce quick internet. Fast internet is one of the wonders of the modern age and you still want your site to load as quickly as possible.
What we’re saying is that if you’re trying to find how to improve Time To First Byte, stop.
It’s far more important to start looking at page load times in their entirety, not just the time it takes for a server to respond. At Hostdedi, we’re proud of how fast our data centers serve content, and we work our hardest to make sure that our servers are optimized to provide a great user experience and help boost your SEO as much as a hosting company can.
We highly recommend checking out the Hostdedi Cloud and seeing how Hostdedi can help.
A Domain Name System (DNS) links domain names to the IP addresses that are used to route data around the internet. It is an important component of eCommerce performance: if a shopper’s browser can’t find the IP address of a store’s server, it can’t send requests and load pages. If the DNS system is slow to respond with an IP address, the browser and the customer are left waiting.
Often, eCommerce retailers have little control over DNS performance. For some, this causes a problem: a performance-optimized Magento store on lightning-fast hosting can still be slow from the perspective of a shopper. Latency introduced by an unresponsive DNS nameserver is just as effective at increasing bounce rate as slow-loading page elements are.
We introduced Hostdedi DNS to give our clients access to a fast and reliable DNS. Hostdedi DNS is a multiple-redundancy Domain Name Service that provides low-latency name resolution for eCommerce and hosting customers located anywhere in the world. Backed by OctoDNS, an industry-leading open-source DNS record management tool originally developed by GitHub, Hostdedi DNS also provides fast DNS record updating and syncing.
The Hostdedi DNS network is simple with 100% uptime thanks to multiple redundancies.
The chart above shows how Hostdedi DNS works. All connections are routed through both Route53 and Dyn, which are connected to servers around the world to further speed up request responses. This cuts out the middleman and means data is served from the closest available location, as quickly as possible.
What to Look for in an eCommerce DNS Solution
eCommerce store owners and website owners generally look for the same things when trying to find a good DNS. These are:
A low latency network.
High reliability with multiple redundancies and a history of 100% uptime (or as close as possible).
The flexibility to update DNS records as easily as possible.
Quick DNS record updates for when changes are made. The faster the better.
Redundant DNS Providers
Hostdedi DNS leverages two of the largest and most reliable DNS providers in the world: Dyn and Amazon Route 53. We use two DNS providers for resilience: if one provider suffers an outage the other can take its place.
It is important to note that the DNS servers are not hosted in Hostdedi data centers but in the data centers of our DNS providers. We provide hosting for the Octo DNS orchestration system, which handles the updating and management of records, but the nameservers are hosted on Dyn and Route 53’s massive and redundant global networks.
Hostdedi DNS uses eight nameservers distributed across both DNS providers. Using multiple nameservers with multiple DNS providers offers an incredible level of redundancy and reliability. The likelihood of an outage affecting all nameservers at both providers is tiny.
The practical consequence is that your store remains accessible and responsive even if there is an outage.
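The reliability gain from running two independent providers is easy to quantify. As a back-of-the-envelope sketch (the 99.99% per-provider figure below is a hypothetical round number, not a published SLA), the chance that both providers are down at once is the product of their individual downtime probabilities:

```python
def combined_availability(*availabilities):
    """Availability of a system that stays up as long as any one provider is up.

    Assumes provider outages are independent, so the probability that all
    providers are down simultaneously is the product of their individual
    downtime probabilities.
    """
    all_down = 1.0
    for a in availabilities:
        all_down *= (1.0 - a)
    return 1.0 - all_down

if __name__ == "__main__":
    # Hypothetical: each provider alone is up 99.99% of the time.
    single = 0.9999
    combined = combined_availability(single, single)
    print(f"Single provider: {single:.4%} up")    # prints 99.9900%
    print(f"Two providers:   {combined:.8%} up")  # prints 99.99999900%
```

Real outages are never perfectly independent (shared routes and common software bugs exist), which is one reason the nameservers are also spread geographically rather than relying on provider diversity alone.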
Both of our DNS providers offer AnyCast DNS networks. Traditionally, a DNS nameserver was a single machine located in a data center. In an AnyCast DNS network, each nameserver is replicated across many different servers in data centers around the world. This has two major benefits: the first is redundancy; if a server fails, requests are routed to a different server.
The Dyn Anycast network is huge, with locations all around the world. Image thanks to Dyn.
The second benefit is latency. DNS requests are always routed to the nearest location, which drastically reduces the round-trip time for DNS requests. You can think of AnyCast DNS as a content distribution network for DNS records (although the technical details are not the same).
The consequence of all this redundancy and geographic distribution is a reliable low-latency DNS service that ensures your shoppers will always be able to reach your store.
Hostdedi DNS is great for eCommerce, but it is just as effective at reducing latency and increasing reliability across our full range of hosting plans, including WordPress, Craft CMS, and ExpressionEngine hosting.
The Hostdedi DNS Checker
Check your DNS and make sure it’s propagated with this simple DNS tool.
In the world of data centers, reliability is one of the most important factors. The more reliable you are, the more likely clients are going to want to use you. After all, who wants a data center that isn’t online?
Luckily, the Telecommunications Industry Association (TIA) published a standard for data centers defining four levels of data centers with regard to their reliability. The aim was for this standard to inform potential data center users about which center is best for them. While brief, the standard laid the groundwork for how some data centers would manage to pull ahead of others in the future.
This article breaks data centers down into the four tiers and looks at how they differ. Combine this with our article on how to choose a data center location, and you’ll know where the best place to host your website is.
Check out our Infographic below to quickly see the main differences between data center tiers, or keep reading for more detail.
The Classification of Data Centers
Data centers are facilities used to house computer systems and associated components. A data center comprises redundant power supplies, data communications connections, environmental controls, and various security devices.
Tier one data centers have the lowest uptime, and tier four have the highest. The requirements of a data center are progressive in that tier four data centers incorporate the data center requirements of the first three tiers in addition to other conditions that classify it as a tier four data center.
The requirements of a data center refer to the equipment needed to create a suitable environment. This includes reliable infrastructure necessary for IT operations, which increases security and reduces the chances of security breaches.
What to Consider When Choosing a Data Center
When choosing a data center to store data for your business, it is important to have a data center checklist. This is a list of the most important things you should keep in mind – such as the physical security of a prospective data center – when making your choice.
Typically, a good data center checklist would cover pricing policies and any extra amenities provided. A straightforward pricing policy, for instance, should have no hidden charges, and a data center with additional facilities is better than one without.
Data Center Specifications
Data center specifications refer to information about the setup in a data center. This can include the maximum uptime, redundant power systems that allow the platform to stay up regardless of power outages, the qualification of technical staff at the data center, and more.
It is common that higher data center tiers have better-qualified staffing since more expertise is required to maintain the whole platform. Data center specifications should be on the data center checklist of a customer looking at prospective data centers to store their data.
What Is a Tier One Data Center?
This is the lowest tier in the Tier Standard. A data center in this tier is simple in that it has only a single, non-redundant set of servers, network links, and other components.
Redundancy and backups in this tier are minimal or non-existent. That includes power and storage redundancy.
As such, the specifications for this tier are not awe-inspiring. If a power outage were to occur, the system would go offline, since there are no failsafes to kick in and save the day.
The specifications of a tier one data center allow for an uptime of approximately 99.671%. The lack of backup mechanisms makes this tier seem like a risk for many businesses, but it can work for small internet-based companies with no real-time customer support. For companies that rely heavily on their data, however, a tier one data center would not be practical.
One of the advantages of tier one data centers is that they provide the cheapest service offering for companies on a budget.
However, the lack of redundancy means that server uptime is considerably lower than in tiers two, three, and four, and maintenance requires shutting down the entire facility, causing further downtime.
What is a Tier Two Data Center?
This is the next level up after tier one. Tier two features more infrastructure and measures to reduce susceptibility to unexpected downtime. The requirements for this data center tier include all those of the first tier, plus some redundancy.
For instance, they typically have a single path for power and cooling. However, they also have a backup generator and a backup cooling system to keep the data center environment optimal.
The specifications for the second tier allow for an uptime of approximately 99.741%, higher than tier one.
What is a Tier Three Data Center?
The requirements for a tier three data center include all those of tiers one and two, plus a more sophisticated infrastructure that allows for redundancy and backups in case of unexpected events that might otherwise cause downtime.
All server equipment has multiple power sources and cooling distribution paths. In case of failure of any of the distribution paths, another takes over ensuring the system stays online. Tier three data centers must have multiple uplinks and must be dual powered.
These specifications ensure a maximum of roughly 1.6 hours of downtime annually (approximately 99.982% uptime). Some of the equipment in tier three systems is fully fault-tolerant.
Some procedures are put in place to ensure maintenance can be done without any downtime. Tier three data centers are the most cost-effective solution for the majority of businesses.
What is a Tier Four Data Center?
Tier 4 is the highest of the data center tiers, with an availability of 99.995%. A tier 4 data center has the most sophisticated infrastructure, with the full capacity, support, and procedures in place to ensure maximum uptime.
A tier 4 data center fully meets all the specifications of the other three tiers. It is fault tolerant: it can operate normally even when a piece of infrastructure equipment fails.
A tier 4 data center is fully redundant, with multiple cooling systems, power sources, and backup generators. Its 99.995% uptime corresponds to an estimated downtime of only about 26 minutes annually.
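The downtime figures quoted for each tier follow directly from the uptime percentages. This short sketch converts an availability percentage into expected annual downtime, using the tier one and two percentages quoted in this article and the commonly cited figures for tiers three and four:

```python
MINUTES_PER_YEAR = 365 * 24 * 60  # 525,600 minutes in a non-leap year

def annual_downtime_minutes(uptime_percent):
    """Convert an uptime percentage into expected downtime minutes per year."""
    return (1 - uptime_percent / 100) * MINUTES_PER_YEAR

tiers = {
    "Tier 1": 99.671,
    "Tier 2": 99.741,
    "Tier 3": 99.982,
    "Tier 4": 99.995,
}

for tier, uptime in tiers.items():
    minutes = annual_downtime_minutes(uptime)
    print(f"{tier}: {uptime}% uptime = {minutes:,.1f} minutes "
          f"(~{minutes / 60:.1f} hours) of downtime per year")
```

The spread is striking: the same "three nines or so" range runs from roughly 29 hours of downtime a year at tier one down to about 26 minutes at tier four.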
These are the four data center tiers and a summary of the requirements used in their design. Anyone building a data center checklist can use these specifications and requirements as the essential elements to look for.
Hostdedi Is a Tier 4 Data Center
With an uptime of 99.9975%, multiple redundancies, and an annual downtime of less than 18 minutes, the Hostdedi data center is regarded as a tier 4 data center. If you would like to know more, check out the different data centers offered by Hostdedi around the world, or take a more detailed look at our Southfield, Michigan data center in an easy-to-read infographic.