Robots.txt File (What It Is, How It Works, & How To Make One)

robots.txt

Not many people take the time to set up a robots.txt file for their website. Search engine spiders check the robots.txt file to see which directories they may crawl, so it can be very helpful in keeping the spiders focused on indexing your actual pages rather than other information, such as your stats!

The robots.txt file is useful for keeping spiders away from folders and files in your hosting directory that are totally unrelated to your actual website content. You can choose to keep the spiders out of areas that contain programming search engines cannot parse properly, and out of the web stats portion of your site.

Many search engines cannot properly view dynamically generated content, such as pages created by programming languages like PHP or ASP. If you have an online store programmed in your hosting account, and it sits in a separate directory, you would be wise to block the spiders from that directory so they only find relevant information.
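For illustration, a minimal robots.txt that keeps spiders out of a couple of directories might look like this (the /store/ and /stats/ folder names are just placeholders for whatever directories you want to keep private):

  User-agent: *
  Disallow: /store/
  Disallow: /stats/

The first line applies the rules to all spiders, and each Disallow line names a directory they should stay out of.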

The robots.txt file should be placed in the root directory of your hosting, where your main site files live. Normally you would create a blank text file, save it as robots.txt, and upload it to the same directory where your index.htm file is located. But Yoast SEO comes to the rescue again.

The awesome people at Yoast SEO included a free tool that lets you edit your robots.txt file without having to worry about the technical hassle of accessing your server.

To access it, go to the “Tools” tab inside the Yoast “SEO” options in your dashboard.

Once you have done that, just click on “File Editor”.

This is where you can make any desired changes to your robots.txt file. If you do not see one, you can follow the on-screen prompts to create one.

If you need more information on your robots.txt file, I’d recommend taking a look at the Yoast SEO Example.

Sitemaps with Yoast (What Are Google Sitemaps?)

sitemap

If you own or maintain a website, or intend to own one, wouldn’t it be great to get frequent visitors who find exactly the information they need on your pages?

While that satisfaction largely depends on the content of your website, how users actually find and reach your site is one of the most critical parts of building it. If your website can’t be reached, you defeat the very purpose of the internet: making information available to any user across the world (or at least within your desired market).

How your site gets found is a matter of presentation style, organization, and, most importantly, how quickly and extensively search engines can lead users to your website. Unless your pages are indexed in the search engines, they can’t send you the free visitors you’re looking for.

Google Sitemaps

Fortunately, the search engines want your content too, and there are a number of ways you can help them – something they actively encourage – by creating sitemaps of your website. Sitemaps created for the various search engines enable their “spiders” to crawl your website’s pages faster, more systematically, and more extensively.

By doing so, you get the maximum exposure you can. That exposure means your pages get viewed, read, and used by more and more visitors the way you intended. On the financial side, the more visitors your website gets, the higher its potential advertising value.

Now, with the vast expansion of websites on the internet, there are several different types of sitemaps, each with its own level of complexity to set up.

HTML Sitemaps

Creating an HTML sitemap linked to and from your home page is something savvy webmasters have been doing for years, and it is perhaps the simplest type to create. This sitemap is simply a list of the pages contained on your site, and it enables search engine spiders to easily find your pages, especially the ones linked deep in your website that they might otherwise have trouble finding.
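As a quick sketch, an HTML sitemap is nothing more than an ordinary page containing a list of links to your other pages (the page names below are made up for illustration):

  <ul>
    <li><a href="/">Home</a></li>
    <li><a href="/about/">About</a></li>
    <li><a href="/services/">Services</a></li>
    <li><a href="/blog/">Blog</a></li>
  </ul>

Link this page to and from your home page so both visitors and spiders can reach it easily.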

TEXT Sitemaps

A text sitemap is simply a list of your site’s URLs in the form of a text file. It can then be submitted to search engines such as Yahoo! to notify them that all of the pages exist, which in turn invites their spider to visit.
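Here’s what that looks like in practice – just one full URL per line, saved as a plain .txt file (the domain and pages below are placeholders):

  https://www.example.com/
  https://www.example.com/about/
  https://www.example.com/blog/first-post/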

XML Sitemaps

Google launched Google Sitemaps as a way for webmasters to give it the information it can use to crawl their sites more thoroughly. This involves creating an XML sitemap, for which Google provided its Google Sitemap Generator. This can be the most complicated type to set up using Google’s own tools, as you need to be running Python on your server. It’s perhaps the most important one too, given the search engine’s current dominance.
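To give you an idea of the format, here is a stripped-down XML sitemap with a single URL entry (the domain and date are placeholders; the full specification lives at sitemaps.org):

  <?xml version="1.0" encoding="UTF-8"?>
  <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
    <url>
      <loc>https://www.example.com/</loc>
      <lastmod>2020-01-01</lastmod>
    </url>
  </urlset>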

Don’t Worry, There Is A Shortcut To Create XML Sitemaps For WordPress.

If you are following along with the “Blitz Challenge”, we installed a free WordPress plugin called “Yoast SEO”. It’s an awesome SEO plugin thanks to its analysis tools for blog posts, but it also comes with some other handy tools.

How To Create An XML Sitemap For WordPress With Yoast SEO

Luckily, when you install Yoast SEO on your website, it automatically creates an XML sitemap by default. When you build your WordPress menus, you may want a link to the sitemap so you can put it in your footer. You can get that link by going to:

SEO > General > Features (Tab)

If you click on the ? icon next to “XML sitemaps” on that page, it will reveal the link to your sitemap.

If the feature is toggled “Off”, simply switch it to “On”.
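Once the feature is switched on, the sitemap link Yoast reveals typically looks something like this (assuming Yoast’s default settings – just swap in your own domain):

  https://yourdomain.com/sitemap_index.xml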

Understanding SEO (Search Engine Optimization): The Breakdown

seo search engine optimization

The first goal of any search engine optimization strategy is to get your website and all pages indexed. But even before that can happen, you need to get the search engine crawlers to visit your website. Depending on the search engine or directory and the overall circumstances (how you invite and solicit crawlers), that first visit could take days, weeks, or even months.

While it’s true that the initial crawler visits can be somewhat unpredictable (or take a long time in coming), once the ice is broken, future visits can be controlled to some degree…

Basically, the more frequently you update your pages, the more frequently the crawlers will show up on your website’s doorstep. That’s one of the reasons I recommend blogging on a regular basis, but more on that later.

Of course, that’s only half the battle. The other half is getting the search engines and directories to actually index your pages.

In order to do that, you need to start at the beginning. And the beginning in this particular instance is developing and enhancing pages in such a way that the search engine crawlers will be impressed.

The overall search process is simple…

All the text content that search engine crawlers gather is stored and indexed. People conduct searches based on certain phrases (keywords). Whatever content possesses the most relevancy with regard to any given keyword will be placed in the top positions of the search results.

Since the title of the page and the text content generally carry the most weight – at least with regard to what search engine crawlers deem most relevant during their visits – it stands to reason that improvement in page rank and/or search results listing can most often be attributed to having individual and specific keywords properly incorporated into those two prime areas.

Of course, if keywords were the only basis for which page rank and position in search results were determined, optimizing web pages would be pretty much cut and dried…

Pick a keyword > use it in your title and throughout your content >

achieve high page rank and top position in search engine results

The problem is, there are so many variables that not only come into play but change on a regular basis, it can seem as though achieving solid and effective search engine optimization might never be possible.

Fortunately, it’s not only possible, it can be relatively painless as well. All you have to do is satisfy the top three requirements of pretty much all major search engines…

  • Provide quality content
  • Update content on a regular basis
  • Get numerous top-ranking websites to link back to your site

The search engines and directories you should be trying to impress the most are the top three contenders…

  1. Google
  2. Bing
  3. Yahoo

Beyond that, there are countless other search engines and directories.

Should you optimize for those as well, or simply set your sights on the major players and bypass all the search engines and directories below them? Not necessarily. You still want your pages listed in as many places as possible; you just shouldn’t try to satisfy every one of them with regard to optimization.

Satisfy the top contenders. Then, if you have the time and ambition to broaden the scope of your SEO efforts, do it. If not, don’t worry about the hundreds (or even thousands) of other search engines and directories that exist.

You’re only human. And just meeting the optimization criteria of the top three is going to be challenging and demanding enough.

Of course, unless you plan to make search engine optimization your life’s work, it’s not likely you’ll invest most of your energy in that one single area (even when restricted to the top three players). But you do need to invest a fair amount of quality effort.

That basically equates to these two missions…

1. Get your pages indexed by major search engines.

2. Improve your page rank and position in search results.

In order to accomplish both of those, you need to carefully balance the line between good optimization techniques and the urge to take things a bit too far.

In other words, you need to make certain you carry out your two missions without stepping over the line into what’s commonly referred to as “black hat” search engine tactics.

That dark and evil territory would include things like…

Keyword Stuffing – repeating keywords over and over again for no logical or practical reason

Hidden Content – including keywords or text that’s the same color as the background for the purpose of manipulating search engine crawlers

Doorway Pages – pages not intended for viewers to see, but created to trick search engines into placing the website in a higher index position

Although these types of practices were once considered intelligent and effective methods of optimization, they can now result in having your website banned from search engines entirely, and quite possibly forever.

In general, it’s better to concentrate on the most popular and most reasonable optimization techniques. By doing that, you’ll not only achieve the results you’re looking for, but those results will also be long-lasting.

When you consider how much work is involved in getting any website to the top of search engine rank and position, it’s worth whatever effort it takes to get it right the first time.

Search Engine Strategy Basics

For the most part, there are three basic things you’ll need to do in order to accomplish proper and effective search engine optimization.

  • Compile keyword lists
  • Publish keyword-rich content
  • Establish a beneficial link strategy

Keywords

The core of any SEO strategy is built almost entirely around the group of keywords you choose to target.

The first order of business is to decide which groups of keywords you’ll be utilizing. In most instances, those groups will be either directly or indirectly related to the topic or niche that your website is (or will be) associated with.

Once you’ve established the individual groups of keywords you want to target, you can begin to compile a comprehensive list of top-level phrases that have each of the following characteristics:

  • Are searched for by thousands of people every month
  • Have little or no competition associated with them

The more people who search for a term, and the less competition associated with it, the more valuable the keyword will be for gaining automatic search engine traffic.

Beyond that, you’ll want to compile lists of secondary keywords. These would still be valuable, but not to the extent that the first top-level list would be.

The main advantage of lower-level keywords is that you don’t have to work quite as hard to get solid search engine recognition. And since you’ll more easily earn a fairly decent position in the results, you’ll also receive additional targeted visitor traffic.

To make up for the lack of quality in the keyword itself (in most cases that equates to fewer searches being conducted every month and therefore less competition), you need to work with a much larger quantity of lower-level keywords.

Basically, the results will be just as good as what you experience through top-level keywords. It will just take more keywords to achieve those same results.

There are several ways you can compile keyword lists. One of the quickest and easiest methods is to use Google’s Keyword Planner.

Now, I would love to go over all the steps for using the Keyword Planner, but I don’t want to take up more time than necessary, so I’d recommend taking a look at this overview:

www.digitoolbag.com/keyword-planner

Quality Content

There are numerous reasons why “Content Is King”.

From a viewer’s perspective, content not only invites them to visit your website but encourages them to return on a regular basis.

It’s a relatively simple equation…

They’re looking for valuable information. Give it to them.

From a search engine perspective, content is one of the primary factors in determining just how much weight or importance should be given to any web page.

Unfortunately, this one isn’t quite as simple an equation…

Search engine crawlers gather and index content. Figure out how to make them place your content higher on the results ladder than some other website.

Of course, in order to become King, content needs to be of considerable quality. In order to remain King, content needs to be updated on a fairly regular basis.

Not to mention the fact that you also need to add content (new pages) on a regular basis. If not, whatever ground you initially gain will simply fade away. And so will whatever search position or rank you’ve achieved.

Linking Strategies

Choosing the right keywords and publishing quality keyword-rich content puts you approximately two-thirds of the way toward optimum search engine recognition. The other third is pretty much solely based on popularity.

If we were talking about popularity in the real world, it would probably include simple things like who was voted King and Queen of the high school prom, or who had the most date options on a Saturday night, or which sibling got the most attention from Mom or Dad.

In the world of search engines, popularity takes on a whole different meaning. And in most instances, it comes down to this… the website with the most quality links pointing to it wins the contest.

Link popularity

That’s the game. And the ultimate goal is to get countless “important” websites (those that have a theme or topic that’s similar to yours) to provide links back to you. Of course, when we’re talking about importance, we’re referring to how major search engines view them.

Most often, that equates to high page rank and top position in search results. The higher up the food chain a website happens to be, the more powerful any link it provides back to you is perceived to be.

In order to get the most bang out of the link popularity process, it’s best if you actually seek out valuable websites. Aside from those you might already have in mind, conduct searches based on the keywords you’re most interested in gaining search engine recognition for.

Naturally, someone who’s in direct competition with you wouldn’t even consider giving you a link back. So what you’re really looking for are popular websites with content or products that are either complementary to yours or indirectly related.

For example, let’s say your topic and keyword is based on ways of perfecting your golf swing. Good link back choices would be websites with the following themes or products:

  • Information about golf courses or golf tournaments
  • Golf equipment or apparel
  • Golf instructors or seminars

If the topic is related to yours and the website that’s providing the link back carries a good deal of weight with major search engines, the value of your own website will automatically be elevated.

When it comes to the actual link that these valuable and important websites place on their pages…

Always encourage the use of text links rather than just a URL. For example, instead of simply displaying https://digitoolbag.com as the link back to your website, you want something more substantial and keyword-rich (and, of course, search engine friendly).

If one of your keywords is “targeted traffic”, for example, the link might read as follows:

Get Exclusive Marketing & Webdesign Tools & Resources With DigiToolBag Free Membership.

That not only gives you credit for the keyword, it encourages the search engine crawler to perceive your website as having more value.
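In HTML terms, the difference is simply what sits between the anchor tags – a bare URL versus descriptive, keyword-rich anchor text (the wording below is only an illustration):

  Plain URL link:
  <a href="https://digitoolbag.com">https://digitoolbag.com</a>

  Keyword-rich text link:
  <a href="https://digitoolbag.com">targeted traffic tools and resources from DigiToolBag</a>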

Always keep in mind that in this particular instance, quality will always win out over quantity. Yes, you want a vast number of links pointing back to your website. But given a choice, you’re much better off with fewer links from important websites than countless links from sites that don’t carry much weight with search engines.

NOTE: While backlinks are important in getting ranked for your desired keywords, only real links are good for you. So please heed my advice when I say DO NOT UNDER ANY CIRCUMSTANCES BUY BACKLINKS. While I fully support sites like Fiverr, there are always bad apples, and when you buy backlinks, especially in bulk, there is no guarantee.

Basic Overview…

  • Most search engines use the meta description and meta keywords tags.
  • They score the overall weight and proximity of keywords, heading tags, and bold text.
  • They reward quality content, generally anywhere between 500 and 1,600 words, and that content should include keywords in both the text and the links.
  • They like to see keywords in the page title (using 90 characters or less) and carried consistently throughout the website.
  • They especially value link popularity, themes, and keywords in URLs and link text.
  • Excessive keywords, cloaking, and link farms are viewed as spamming and/or “black hat” SEO.
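To make that concrete, here is a bare-bones sketch of where those on-page elements live in an HTML page (the “targeted traffic” keyword is simply carried over from the earlier example, and all the wording is illustrative):

  <head>
    <title>Targeted Traffic: How To Get More Visitors To Your Website</title>
    <meta name="description" content="Practical ways to drive targeted traffic to your website.">
  </head>
  <body>
    <h1>How To Get More Targeted Traffic</h1>
    <p>Quality content that works the phrase <strong>targeted traffic</strong> in naturally...</p>
  </body>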

What Not To Do…

After all your hard work getting your web pages optimized, the last thing you want is to do something that would prevent your site from getting indexed. Or worse, have it blacklisted by search engines altogether.

At the top of the “don’t do” list is the use of invisible text (text that is the same color as the background). Almost every search engine is wise to this practice and will ban any website found using it.

Here is a quick rundown of everything else you should never do…

  • Don’t repeat keywords excessively.
  • Don’t place irrelevant keywords in the title and meta tags.
  • Don’t make use of link farms.
  • Don’t submit to inappropriate categories in search directories.
  • Don’t submit too many web pages in one day.
  • Don’t publish identical pages.
  • Don’t use meta refresh tags (see the example below).
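For reference, a meta refresh tag is a snippet placed in a page’s head that automatically redirects visitors after a set number of seconds – the kind of thing search engines treat with suspicion (the URL below is a placeholder):

  <meta http-equiv="refresh" content="0; url=https://www.example.com/">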

No matter how good your website is – no matter how valuable the content it contains or how legitimately optimized it might be – if you use any of the things spelled out above, you run the risk of being blacklisted and branded as a search engine spammer.

Although it varies from one search engine to another, spamming can include one or more of the following:

  • Irrelevant web page titles and meta description and keywords tags
  • Repetition of keywords
  • Hidden or extremely small text
  • Submitting web pages more than once in a twenty-four-hour span
  • Mirror sites that point to different URL addresses
  • Using meta refresh tags

When it comes to directories such as DMOZ (which have human editors), spamming generally equates to one of these three practices: deliberately choosing an inappropriate category within the directory, using marketing language, or excessive capitalization.

It’s not difficult to stay out of black hat territory. But it’s certainly difficult to recover from having used those types of techniques. That is, assuming you can recover at all.

Just pay attention to the rules established by search engines and directories. And since Google is the player you’ll most want to satisfy, it’s important that you read and re-read their webmaster guidelines (published at http://www.google.com/webmasters/) on a regular basis.

Break the rules and you’ll always be struggling to gain benefit from all the major search engines. Follow the rules and you’ll establish web pages that will not only be around a long time, they’ll always be in contention for top search results position.

Now that we have gone through that … WE ARE DONE!

Should I Hire An SEO Company Or Should I Do It Myself? SEO Questions

hire seo company

Are you thinking “Should I hire an SEO company” but you’re not sure if it’s the best decision for your business? Let’s consider a few things first.

We’re now in an age where every business owner is building their own website. You need to find a way to make people notice yours, especially now that there are far more business websites than ever before. To ensure people actually find your site, you will need some understanding of “search engine optimization”, or SEO.

You may have heard about SEO in many contexts related to the web, but perhaps you never considered applying those content enhancements yourself. SEO is considered a vitally important part of creating content for your website, since it is what gets your site found on the internet.

Many business owners will hire an SEO consultant to make their websites more relevant and visible on the web. Business owners who try various local search engine optimization techniques on their own often don’t use those techniques properly. A local SEO consultant, however, can advise business owners like yourself on how to use proper SEO techniques.

Search engines like Google, for example, appreciate high-quality, unique content. They especially like content that’s both well written and relevant to your market, industry, products, or services. Google is constantly changing its search ranking algorithms (how it decides which websites show up first for a given search term) to filter out bad or irrelevant web content. People won’t see your content if it gets buried by search engines like Google. By far the easiest and fastest way to deal with this problem is to hire an SEO consultant to help you.


Should I Hire An SEO Company?

If you feel you need someone to help bring your web content to the next level (say, from page 27 to page 1), there are a variety of SEO consultants out there to get you started.

An SEO consultant can help you to create or restructure the content on your website to ensure that Google’s search programs will understand it and rank it better. A good SEO specialist will work to understand the main goals of your business and marketing campaign, so they can design a great local SEO strategy to earn you the web traffic that you need and deserve.

Website traffic is important to the bottom line of your business, which is usually revenue flowing in from new and repeat customers.

Is it even worth it to hire an SEO consultant?

If you’re confident enough to study proper SEO practices, you may not need the skills of a professional “SEO Expert”. There are plenty of resources available online for you to learn enough to do it successfully yourself.

I hope this helped you toward your decision on whether you should hire an SEO consultant or company, and remember to comment below. Oh, and of course… share it with your friends!

If there is anything I can do to help you, feel free to ask. I’m glad to help as much as possible. You can reach me by using the contact page or by sending me a DM on Instagram.

Also, if you need more info on SEO you can check out MOZ. To be honest, there isn’t a better blog and information hub on Search Engine Optimization than MOZ! They also have a bunch of awesome FREE Tools For SEO!