Why you need to get back to SEO basics
You can be well-versed in all the latest SEO trends, but columnist Ryan Shelley notes that you need to get the fundamentals down first.
Do a quick search on Google for “SEO tips” and you’ll get over 14 million results. That’s a lot of tips to wade through when trying to figure out the focus of your SEO strategy. What’s even more overwhelming: that’s just one search.
Each year brings new lists of the “hottest” tips and tricks that are “guaranteed” to work. While many of these tips are great, to really see results, you need a good foundation. In this post, I want to talk about getting back to the basics of SEO and why they are essential to long-term success.
When it comes to optimizing your site for search, the basics are some of the most important, yet often overlooked, aspects of SEO. The recent push of “content is king” has also caused many to forget the essentials and just focus on content distribution.
Here’s the deal: you can post all the content you want, but if your site isn’t optimized, you’re not going to get the rankings you want. So here are a few basics you should cover before ever diving into the more complex elements of search.
Crawler access
If search engine crawlers have a hard time crawling your site, they’ll have a hard time indexing and ranking your pages, too. As a site owner or SEO, your first and most important job is to make sure that your site is crawlable. Using the robots.txt file, you can help direct and assist the web crawlers that are crawling your site.
There are certain pages on your site that you probably don’t want the crawlers to index, such as login pages or private directories. You can block files, pages and/or directories by specifying them as “disallowed,” like so:
User-agent: *
Disallow: /cgi-bin/
Disallow: /folder
Disallow: /private.html
You can also block certain crawlers from accessing your site using the following (replace “BadBot” with the actual bot name you’re trying to block):
User-agent: BadBot
Disallow: /
Just be careful when blocking crawlers from your entire site; don’t do it unless you know for certain that a particular bot is causing you trouble. Otherwise, you may end up blocking crawlers that should have access to your website, which could interfere with indexing.
If you are using WordPress, there are a number of plugins that can help you do this. If you are not using WordPress, you can also easily set up a robots.txt file on your server. Learn more about robots.txt here.
After you’ve created your robots.txt, it’s important to make sure Google can crawl your site. To do so, you’ll first need to create a site map. This can be done manually or with third-party tools. (If you have a WordPress site, there are many plugins available to create site maps for you.)
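If you decide to build the file by hand, a site map is just an XML list of your URLs that follows the sitemaps.org protocol. Here’s a minimal sketch (the URLs and date are placeholders — swap in your own pages):

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.example.com/</loc>
    <lastmod>2017-06-01</lastmod>
  </url>
  <url>
    <loc>https://www.example.com/about/</loc>
  </url>
</urlset>

Save it as something like sitemap.xml in the root of your site so it’s easy to reference when you submit it.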
Once you’ve created your site map, log in to Google Search Console. (If you haven’t set your site up on Search Console, check this out.) You’ll want to upload your site map by going to “Crawl,” then “Sitemaps” in the left-hand navigation, then clicking on the “Add/Test Sitemap” button in the upper right-hand corner. From there, you can test the site map and submit it to Google for indexation. (Note that it will take some time for Google to crawl and index your site.)
If you have already submitted a site map and just want to test/submit an individual page on your site, you can use the “Fetch as Google” feature, which is also under “Crawl” in the left-hand navigation.
- Once logged in, click “Crawl” in the left-hand navigation.
- Then select “Fetch as Google.”
- From there, enter the URL path of the page you want to test and click “Fetch.” (Leave this blank if you want to test the home page.)
- Check the status. It should have a green check mark and say “Complete.”
- Click “Request Indexing” if available.
Making sure that Google can crawl your site is essential to getting indexed. If your site isn’t indexed, you will not rank, no matter what you do.
Site structure
In today’s mobile-first, user-obsessed web culture, we sometimes overlook the simple and practical. While I am all for a good user experience and a huge believer in being mobile-first, I also believe we can’t forget the search engines. Having a solid site structure will add to your user experience and will help you rank better.
While this seems like a simple idea, building a good site structure takes time and planning. Not only does it shape your navigation and sitelinks, but it also helps the crawlers better understand your content and its context. Site structure is all about organizing your content in a logical fashion. Don’t make your users or the search engines dig to find what they came to your site for. Learn how to create a great site structure here.
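As a quick illustration (the topics here are hypothetical), a logical structure groups pages under the category they belong to, so the URL itself tells users and crawlers where they are:

example.com/
example.com/services/
example.com/services/seo/
example.com/services/local-seo/
example.com/blog/
example.com/blog/seo-basics/

Every page is reachable within a few clicks of the home page, and related content lives together under the same branch.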
Titles and meta descriptions
Titles and meta descriptions are some of the most basic elements of SEO. While titles factor into the ranking algorithm and descriptions do not, both are still very important. Google may not use descriptions as a ranking signal, but that doesn’t mean it ignores them. The crawlers still read the descriptions, and any chance you have to tell the crawlers about your page, you should take it.
The title and the description are often the first things your potential visitors come in contact with in the SERPs. Here are a few tips for creating better titles and descriptions (see the example markup after these lists).
Titles
- Optimize your title tag around the core focus of your page.
- Don’t “keyword stuff.”
- Stay within 50 to 60 characters.
- Make it relevant to your users.
- Don’t have duplicates.
Descriptions
- Make it action-oriented.
- Add your primary keyword.
- Make copy easy to understand.
- Stay within 135 to 160 characters.
- Don’t have duplicates.
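Both elements live in the <head> of your page. Here’s an illustrative snippet (the wording, keyword and brand are placeholders) that follows the tips above:

<head>
  <title>SEO Basics: How to Build a Strong Foundation | Example Co.</title>
  <meta name="description" content="Learn the SEO basics, from crawler access to site structure, titles and descriptions, and start building a foundation for long-term rankings.">
</head>

The title leads with the page’s core focus and stays under 60 characters, while the description reads like a short pitch that earns the click.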
Having better titles and descriptions can lead to higher click-through rates and increase the visibility of your site in search. It’s important to note that if Google thinks the metadata you’ve provided doesn’t meet the user’s intent, it will alter it.
Before jumping into the latest and greatest SEO tactic, make sure you do the basics first. It’s amazing what a few simple tweaks and adjustments can do for your site and overall online marketing strategy. Make sure your site is crawlable, create a structure that is both user- and search engine-friendly, and take the time to create better titles and descriptions. Doing the basics will help you build a strong foundation for long-term success.