All posts by root

Preventive Maintenance

HTML5 benefits for SEO

For over 10 years HTML 4.01 was the standard for coders around the world, but since the release of HTML5 the user's web experience has been taken to a whole other level of functionality. Web browsers like Safari, Chrome, IE and Firefox have embraced this new web standard and implemented many of its features.

Although HTML5 offers many advantages for SEO, not many companies know exactly what those advantages are or how to take full advantage of the latest features.

Here are some simple key elements of HTML5 that will help your SEO:

1. Updated Semantic Tags & Geo-location Microdata Elements

HTML5 provides the structure and elements to bring better visibility to your site, such as the header, article, and nav tags. With geolocation microdata, you can have search engines index your information much more effectively, including reviews, events, and more.
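
As a rough sketch (the business name, URLs and schema.org properties here are invented for illustration), a section marked up with semantic tags and microdata might look like this:

<header>
  <h1>Joe's Auto Repair</h1>
  <nav>
    <a href="/services">Services</a>
    <a href="/reviews">Reviews</a>
  </nav>
</header>
<article itemscope itemtype="http://schema.org/LocalBusiness">
  <h2 itemprop="name">Joe's Auto Repair</h2>
  <p itemprop="address" itemscope itemtype="http://schema.org/PostalAddress">
    <span itemprop="addressLocality">Springfield</span>
  </p>
  <p itemprop="aggregateRating" itemscope itemtype="http://schema.org/AggregateRating">
    Rated <span itemprop="ratingValue">4.8</span> out of <span itemprop="bestRating">5</span>
  </p>
</article>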

SEO benefits: “With this cleaner code, pages will be indexed more efficiently. Pages will also load much faster thanks to the simpler markup and fewer calls to third-party tools.”

2. New Media Elements

HTML5 does not need the help of most third parties in order to deliver rich multimedia features. For instance, playing audio or video on a webpage had no set standard practice until HTML5 came along. Flash has generally been the most common third-party tool for playing video and audio files, but with HTML5 you can say goodbye to Flash little by little. Embedding video into your site and having it work in nearly all browsers is part of what HTML5 brings to the table.
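
A minimal example of native video embedding (the file names are placeholders); the browser picks the first source format it supports:

<video controls width="640" height="360">
  <source src="intro.mp4" type="video/mp4">
  <source src="intro.webm" type="video/webm">
  Your browser does not support the video element.
</video>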

SEO benefits: HTML5 videos can be indexed by search engines much more easily. Playback is fast and browser-based. Users on slower internet connections also get a better browsing experience with less buffering, since downloading and playback are both handled natively by the browser.

3. Canvas Element

With the HTML5 canvas element you can draw charts and graphics, or go all out and build an interactive experience for your site.
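
For instance, a few lines of JavaScript are enough to draw a simple bar chart on a canvas (the data values here are made up):

<canvas id="chart" width="300" height="150"></canvas>
<script>
  var ctx = document.getElementById('chart').getContext('2d');
  var values = [40, 80, 120];  // sample data
  for (var i = 0; i < values.length; i++) {
    // one bar per value, drawn up from the bottom of the canvas
    ctx.fillRect(20 + i * 60, 150 - values[i], 40, values[i]);
  }
</script>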

SEO benefits: It provides greater stability than a plugin, since canvas support is built into the browser.

4. Local Client Side Storage

Cookies may also become a thing of the past, as we learn that holding onto cookie data is sluggish at best and not the best solution for the browsing experience. HTML5 local storage only touches the data when it is actually needed, and the data is stored and accessed using JavaScript.
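
A minimal sketch of local storage in JavaScript (the key and value are arbitrary examples):

<script>
  // Store data on the client; it persists between visits.
  localStorage.setItem('lastVisitedCategory', 'html5-seo');

  // Read it back only when it is actually needed.
  var category = localStorage.getItem('lastVisitedCategory');
  if (category !== null) {
    console.log('Last viewed category: ' + category);
  }
</script>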

SEO benefits: The website’s performance remains unaffected even when large amounts of data are stored.

5. Form Elements

Forms have now been given more variety and functionality, with input types such as date, time, URL, email, and search.
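
For example (field names and the form action are placeholders), the new input types can be used directly in markup, and supporting browsers add their own pickers and validation:

<form action="/subscribe" method="post">
  <input type="email" name="email" placeholder="you@example.com" required>
  <input type="url" name="website" placeholder="https://example.com">
  <input type="date" name="start-date">
  <input type="time" name="start-time">
  <input type="search" name="q" placeholder="Search...">
  <button type="submit">Send</button>
</form>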

SEO benefits: Better design and functionality on mobile devices.

6. Multi-tasking capabilities through Web Workers

HTML5 makes web browsers capable of multitasking through Web Workers. In the past, JavaScript could not handle heavy computation, and long-running scripts would get timed out. Web Workers are somewhat similar to traditional multi-threaded Java or C# applications.
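
A minimal sketch of a Web Worker; the worker code is built from an inline string here so the example is self-contained, though in practice it usually lives in its own .js file:

<script>
  // Worker code as a string, turned into a Blob so no separate file is needed.
  var workerSource =
    'onmessage = function (e) {' +
    '  var sum = 0;' +
    '  for (var i = 0; i < e.data; i++) { sum += i; }' +
    '  postMessage(sum);' +
    '};';
  var blob = new Blob([workerSource], { type: 'text/javascript' });
  var worker = new Worker(URL.createObjectURL(blob));

  // The heavy loop runs in the background; the page stays responsive.
  worker.onmessage = function (e) { console.log('Sum: ' + e.data); };
  worker.postMessage(100000000);
</script>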

SEO benefits: Developers gain greater flexibility by separating page content from the code that handles processing tasks.

SEO and web design

Search engines are limited in how they crawl the web and interpret content. A webpage doesn’t always look the same to you and me as it looks to a search engine. As technology advances and web usage evolves, so do SEO best practices. Web designers now have more choices and technologies available than ever before.

I thought it would be helpful to revisit the top SEO considerations relating to some of the latest website design trends, which include parallax, responsive, and HTML5 design.

Although I am a huge fan of incorporating all three design choices when appropriate, in all cases site architecture and accessibility remain the primary SEO concerns.

Most people know that designing with the end user in mind also helps improve SEO rankings. There are technical reasons why this is the case and why it’s of paramount importance to design with the user in mind. This article focuses on this concept in more detail, and I provide actionable next steps for SEO professionals to consider when thinking about website interaction and user experience design.

Website Design Trends

Parallax design puts web content on one page, at the user’s fingertips, and can be a great way to lead a consumer through a storytelling process. Every site I’ve worked on that has adopted some type of parallax storytelling design has seen improvements in conversion rates.

Responsive design is Google’s recommended method of designing for multiple devices. There are tremendous user experience advantages to adopting responsive web design concepts that allow your website to perform optimally for multiple devices.

In addition to the user experience benefits of responsive web design, the primary benefit for SEO is that it does not dilute your link equity. In other words, responsive web design gives you one URL for both your mobile site and your main site, which means that you are more likely to do a better job increasing your external backlink count to each page versus having to drive links to two separate URLs.

HTML5 has been touted as the next big thing in web design, but implementation can be troublesome for SEO. HTML5 designs can be amazing, interactive and inspirational, but if they aren’t coded properly, Google sees an empty page.

For example, in this post we can find examples of great HTML5 animation. However, here is what Google finds when it crawls the page:

[Screenshot: the JavaScript-heavy markup Google sees when it crawls the page]

Many websites that incorporate all the visually appealing aspects that popular HTML5 coding delivers also heavily incorporate JavaScript, which makes it difficult for search engine bots to understand the content. It is possible to show static content that represents your HTML5 content so that bots can better index your website.

Unfortunately, few website developers take the time to make a static version of the content for search bots.
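
One common approach, sketched here with invented content and file names, is to serve the key text and links in the initial HTML so crawlers have something real to index, and let the script enhance that static content:

<!-- Static content delivered in the initial HTML; visible to crawlers. -->
<div id="animation-stage">
  <h1>Spring Tire Sale</h1>
  <p>Save 20% on all-season tires through April 30.</p>
  <a href="/tires">Browse tires</a>
</div>
<!-- The animation script progressively enhances the markup above. -->
<script src="fancy-animation.js"></script>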

When Less Is More

Normally when I hear the words, “We need to reduce some content,” I immediately think “SEO disaster.” When I am done shuddering, I remind myself that sometimes less is more.

As a result of recent changes by Google like Hummingbird, marketers are observing that placing too much emphasis on topical subcategory pages could be a bad SEO strategy. Historically, more categories were good for SEO because they meant that we had more content that could be ranked for body/torso terms.

Cognitive Dissonance & Web Design

The theory of cognitive dissonance states that people have a drive to reduce dissonance to create consistency and a feeling of control — to make their expectations match their reality. According to the theory, when people are unable to do so, they will simply avoid whatever is making them feel out of control.

To provide the best website experience, we should let data and the user experience drive our design. Only by doing this can we ensure we aren’t alienating our target audience.

SEO professionals particularly need to be aware of the cognitive dissonance that can occur at the keyword-to-landing-page level. The keywords you optimize for must match the landing-page experience that a searcher would expect.

Because of this principle, I recommend routinely sorting SEO entry pages by bounce rate. Then, start with the pages with the highest bounce rate and double check the types of search keywords that were driving traffic to those pages to ensure that the entry page provided an adequate response to the top entry keywords.

Now that keywords are “100% not provided,” this can still be accomplished by checking keywords from Bing as a relatively good proxy.

Reasonable Surfer Patent

When we design our pages with the end user in mind, we should also keep in mind Google’s reasonable surfer patent. According to this patent, the most prominent links and the links that are clicked more frequently pass more internal PageRank.

This means that if you are making changes to your website’s main navigation, or if you move prominent links to less prominent locations, there is a possibility that ranking declines will ensue. This is just another example of how closely SEO is tied to website design.

There are many situations where website design choices can help or hurt your SEO efforts. Many of the trends in website design indicate that accessibility and website architecture best practices are still important considerations.

We benefit both our users and ourselves when we design webpages in a way that doesn’t overwhelm or confuse users, but leads them to links, categories or subcategories to find the content they need. Creating pages with our end users in mind and combining web design best practices with SEO best practices is a win-win.

SEO for WordPress

WordPress is a great blogging platform and is very easy to use. People often forget that WordPress’s standard installation is very easy to search engine optimize. It has natively built-in features that can help a site owner get their site crawled.

Natively, WP comes with a ton of great features for SEO. One of these is the ability to use .htaccess, a hidden configuration file, to create clean, readable URLs, referred to as permalinks. On top of WP’s natively built-in features, there are a ton of great third-party plugins available that are incredibly useful for WP search engine optimization.
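
For reference, the rewrite rules WordPress writes into .htaccess to enable permalinks on a typical Apache setup look roughly like this (the exact block can vary with the WordPress version and install path):

# BEGIN WordPress
<IfModule mod_rewrite.c>
RewriteEngine On
RewriteBase /
RewriteRule ^index\.php$ - [L]
RewriteCond %{REQUEST_FILENAME} !-f
RewriteCond %{REQUEST_FILENAME} !-d
RewriteRule . /index.php [L]
</IfModule>
# END WordPress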

Check The Code, Content and Other Site Owners

Blog owners often spend a ton of time customizing their site. While this can be great for making a unique site, it can also hurt some of the standard SEO features built into WP. Once you’ve got a site all set up, it’s a good idea to go back to the basics and check a few things. First, make sure all of the markup validates cleanly. Validation errors can hinder a search engine spider from successfully crawling all parts of a site. Second, make sure you have unique content, and lots of it. It is extremely difficult to optimize a site that is completely media-saturated. Content is king, and ensuring that there is fresh textual content on a site is paramount.

A site owner should take full advantage of WP’s built-in blogrolls, pingbacks and trackbacks. These features help you link your site to other sites, as well as have them link back to you. Blogrolls are essentially a list of blogs you, as a site owner, are a fan of. The WP engine will track all of the updates from these sites and sum them up in list form on your site. Pingbacks are essentially a pat-on-the-back system: they allow site owners to reference each other’s posts inside their own posts. If pingbacks are enabled, the site owner being referenced gets an alert.

Next, a site owner will want to consider the structure of their site. A good set of navigational links is hugely important. Search engines crawl your site by moving from page to page, and bad links can keep the spiders from discovering new content, effectively lowering your rankings. WP comes with a default robots.txt file for search engine spiders, also called crawlers. Often, as a site gets customized, the area where content is most often posted is no longer the WP default, so as a site owner you must pay close attention to your robots.txt file. For example, a typical WP robots.txt file contains:

User-agent: *
Disallow: /cgi-bin
Disallow: /wp-admin
Disallow: /wp-includes
Disallow: /wp-content/plugins
Disallow: /wp-content/cache
Disallow: /wp-content/themes
Disallow: /trackback
Disallow: /feed
Disallow: /comments
Disallow: /category/*/*
Disallow: */trackback
Disallow: */feed
Disallow: */comments
Disallow: /*?*
Disallow: /*?
Allow: /wp-content/uploads

Updating this file can become a cumbersome task; however, there is a great plugin called KB Robots.txt that can take some of the guesswork out of it. This plugin allows a site owner to create and edit their robots.txt file from within the WP administration section.

Don’t Forget The Meta Tags

WP stores all of the meta information in the head section of the header.php file. These tags are not included by default, so it is important to take the time to add them. To put the meta tags back into your WP installation, open up your WP theme in a code editor and look for the following line of code near the top of the page:

<title><?php bloginfo('name'); ?><?php wp_title(); ?></title>

Just beneath this line of code you can manually add all of your meta information. This typically includes the language, author, content description and any copyright information. This meta information is not dynamic, meaning whatever you put here won’t change across the entire site.

<meta name="resource-type" content="document">
<meta name="description" content="Short tutorial on the use of meta tags">
<meta name="keywords" content="meta, tags, html, tutorial">
<meta name="distribution" content="global">
<meta name="generator" content="MSHTML 6.00.2723.2500">

If the site has fixed content, this may not be an issue, but if you’re looking for more dynamic meta tag information, a plugin is your only choice. The most popular plugin for meta managing is the “Customize Meta Widget”. This plugin allows you to dynamically assign meta tags to each page of your site, which is extremely useful for optimizing a site with multiple types of content. The plugin is easy to use and simply replaces WP’s default Meta widget with a built-in variation that doesn’t require a link to wordpress.org.

Google Site Maps

XML sitemaps can be very useful when trying to optimize your WP site for Google, but they become very difficult to create and maintain as a site grows. Luckily, there’s a great plugin for this too, called “Google (XML) Sitemaps Generator”. This tool will automatically generate and update a sitemap covering all types of WP pages. It is very easy to use and doesn’t require any coding skills.

Optimize Your Theme

After all the basics have been covered, it is important to take a look at the theme and design of the WP page. There are a couple of plugins that can help with this. The first is the breadcrumb plugin. Breadcrumbs are the small links that appear above a post title. They usually look something like this: “Home > Articles > Post Title.” These links help visitors navigate your site and allow the search engines to navigate more freely. This plugin helps with breadcrumb links for posts that fall into multiple categories by forcing a single category to be chosen for the breadcrumb trail.
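
As an illustration (the URLs and labels are invented), breadcrumbs are usually plain anchor tags, optionally marked up with schema.org microdata so search engines recognize the trail:

<nav itemscope itemtype="http://schema.org/BreadcrumbList">
  <span itemprop="itemListElement" itemscope itemtype="http://schema.org/ListItem">
    <a itemprop="item" href="/"><span itemprop="name">Home</span></a>
    <meta itemprop="position" content="1">
  </span> &gt;
  <span itemprop="itemListElement" itemscope itemtype="http://schema.org/ListItem">
    <a itemprop="item" href="/articles/"><span itemprop="name">Articles</span></a>
    <meta itemprop="position" content="2">
  </span> &gt;
  <span>Post Title</span>
</nav>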

One of the easiest solutions to theme optimization is just to choose and download one of the many already optimized templates for the engine. This can save tons of time and effort.

Optimize Your Server

The WP engine is natively very fast and efficient. While there aren’t any direct WP settings for optimizing your server, it is important to ensure your server is fast and reliable. Speed and reliability can truly make a difference in rankings and search engine optimization. Look for a host with good uptime ratios and snappy servers.

Summary

Overall, WP is a great tool for a small business owner. Right out of the box it is very SEO-friendly and can be further optimized with just a few small steps. Taking the time to check your meta tags, sitemaps, theme layout, server reliability and robots.txt can make a tremendous difference in the visibility of your site. There are a ton of great plugins out there to help with this, and this should not be considered a definitive list, so read the reviews and try them out.

5 Keys to Automotive SEO

How Do Web Search Engines Work?

Search engines are the key to finding specific information on the vast expanse of the World Wide Web. Without sophisticated search engines, it would be virtually impossible to locate anything on the Web without knowing a specific URL. But do you know how search engines work? And do you know what makes some search engines more effective than others? When people use the term search engine in relation to the Web, they are usually referring to the actual search forms that search through databases of HTML documents, initially gathered by a robot.

There are basically three types of search engines: those that are powered by robots (called crawlers, ants or spiders), those that are powered by human submissions, and those that are a hybrid of the two.

Crawler-based search engines use automated software agents (called crawlers) that visit a Web site, read the information on the actual site, read the site’s meta tags and also follow the links that the site connects to, performing indexing on all linked Web sites as well. The crawler returns all that information to a central repository, where the data is indexed. The crawler will periodically return to the sites to check for any information that has changed; the frequency with which this happens is determined by the administrators of the search engine.
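
To make the crawl-and-index loop concrete, here is a heavily simplified sketch in JavaScript (it assumes a recent Node.js with the built-in fetch; a real crawler also respects robots.txt, handles errors, and parses HTML properly rather than using a regular expression):

// A toy crawler: fetch a page, note its contents, follow the links it finds.
const seen = new Set();

async function crawl(url, depth) {
  if (depth === 0 || seen.has(url)) return;
  seen.add(url);

  const html = await (await fetch(url)).text();
  // The "send to the repository / index" step would happen here.
  console.log('Fetched', url, '-', html.length, 'bytes');

  // Crude link extraction; a real crawler uses an HTML parser.
  const links = [...html.matchAll(/href="(https?:\/\/[^"]+)"/g)].map(m => m[1]);
  for (const link of links) {
    await crawl(link, depth - 1);
  }
}

crawl('https://example.com/', 2);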


Human-powered search engines rely on humans to submit information that is subsequently indexed and catalogued. Only information that is submitted is put into the index.

In both cases, when you query a search engine to locate information, you’re actually searching through the index that the search engine has created; you are not actually searching the Web. These indices are giant databases of information that is collected, stored and subsequently searched. This explains why a search on a commercial search engine, such as Yahoo! or Google, will sometimes return results that are, in fact, dead links. Since the search results are based on the index, if the index hasn’t been updated since a Web page became invalid, the search engine treats the page as still an active link even though it no longer is. It will remain that way until the index is updated.

So why will the same search on different search engines produce different results? Part of the answer is that not all indices are going to be exactly the same; it depends on what the spiders find or what the humans submitted. But more important, not every search engine uses the same algorithm to search through the indices. The algorithm is what a search engine uses to determine the relevance of the information in the index to what the user is searching for.

One of the elements that a search engine algorithm scans for is the frequency and location of keywords on a Web page. Those with higher frequency are typically considered more relevant. But search engine technology is becoming sophisticated in its attempt to discourage what is known as keyword stuffing, or spamdexing.

Another common element that algorithms analyze is the way that pages link to other pages on the Web. By analyzing how pages link to each other, an engine can both determine what a page is about (if the keywords of the linked pages are similar to the keywords on the original page) and whether that page is considered “important” and deserving of a boost in ranking. Just as the technology is becoming increasingly sophisticated at ignoring keyword stuffing, it is also becoming more savvy to webmasters who build artificial links into their sites in order to build an artificial ranking.

Did You Know?
The first tool for searching the Internet, created in 1990, was called “Archie”. It downloaded directory listings of all files located on public anonymous FTP servers, creating a searchable database of filenames. A year later “Gopher” was created, which indexed plain text documents. “Veronica” and “Jughead” came along to search Gopher’s index system. The first actual Web search engine, developed by Matthew Gray in 1993, was called “Wandex”.

Search Engine

Search engines are programs that search documents for specified keywords and return a list of the documents where the keywords were found. A search engine is really a general class of programs; however, the term is often used to specifically describe systems like Google, Bing and Yahoo! Search that enable users to search for documents on the World Wide Web.

Web Search Engines

Typically, Web search engines work by sending out a spider to fetch as many documents as possible. Another program, called an indexer, then reads these documents and creates an index based on the words contained in each document. Each search engine uses a proprietary algorithm to create its indices such that, ideally, only meaningful results are returned for each query.
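
The indexing step can be pictured as building an inverted index: a map from each word to the documents that contain it. A tiny JavaScript sketch with invented documents:

// Map each word to the set of document ids that contain it.
const documents = {
  doc1: 'search engines index web pages',
  doc2: 'web spiders crawl pages and follow links'
};

const index = {};
for (const [id, text] of Object.entries(documents)) {
  for (const word of text.toLowerCase().split(/\W+/)) {
    if (word) (index[word] = index[word] || new Set()).add(id);
  }
}

// A query looks the keyword up in the index, not in the documents themselves.
console.log([...index['pages']]);  // ['doc1', 'doc2']
console.log([...index['crawl']]);  // ['doc2']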

As many website owners rely on search engines to send traffic to their websites, an entire industry has grown around the idea of optimizing Web content to improve placement in search engine results.

Recommended Reading: How Web Search Engines Work.