
Why Your Site Is Indexed But Not Ranking

Joe Balestrino • Dec 08, 2020

Indexing Doesn't Mean You're Going To Receive Traffic

Getting your website indexed but not ranked is one of the most common problems that beginner SEOs, and sometimes even veterans, have to deal with. There are many things that could cause your website not to rank, some of which are of a more technical nature, while others are more manual. But first things first: what is indexing?


What does it mean to be indexed?


Indexing refers to how search engines crawl websites and webpages so that they can include them in their databases. All search engines have bots that crawl vast numbers of websites every day to find new sites and pages to add. As such, a website or page is said to be indexed when it is included in the database of a search engine such as Google or Bing.


You have to remember that being indexed does not mean that you are ranking, but simply that your pages and content are in the database. It is then up to the search engines to rank that content or decline to rank it due to technical or quality issues.

Is my website indexed?

The best way to find out if your pages are in the index is to use the site: search operator on Google, e.g. site:example.com. Run this search with your own domain and you should see which of your pages are indexed. Once you have checked which pages are indexed, you can move on to determining why your keywords are not ranking.


➡️ Why Your Keywords Are Not Ranking


There are many reasons a keyword may not rank, some of which have to do with crawling and indexing issues, while others concern technical aspects of the website. The following are some of the most common:


➡️ You Have Chosen Keywords That are Too Competitive or Too Broad


A lot of beginners fall into the trap of targeting keywords with high search volume. While ranking for high-volume keywords is not impossible, it may require a lot of work, skill, or money that a beginner may not have. By targeting such competitive keywords, you will also be competing with more experienced SEOs who have the money, skill, and time to rank for them. While you can target competitive keywords as a long-term strategy, you are unlikely to see rankings for a long time if you do not have the time, resources, and money to compete.


➡️ Your Website is New


If your website is too new, it can be very difficult to rank for keywords, given that you have done little link building and you are still in the Google Sandbox. This means you will have little domain authority or page authority to rank for anything in the short term unless you target obscure keywords.


To rank for your keywords, you most likely need domain authority, page-level links, or high-quality content, or a combination of two or more of these. Moreover, new websites tend to have trouble ranking, as there is a Sandbox period of roughly 3 to 6 months before your website starts being taken seriously by the search engines. During this time you may not rank for many of your targeted keywords, even if your pages are indexed.


➡️ Poor Quality Content


Ranking for your keywords with high-quality content alone is possible, but the reverse is also true: very poor-quality content can keep you from ranking even if you have backlinks and high domain authority. The reason is that poor-quality content sends negative signals to the search engines.


If your content is not that good, it may result in pogo-sticking (where users click on a result and immediately click the back button), low time on page, and high bounce rates, which may cause Google to de-rank your keywords.


Keep in mind that long content does not necessarily equal high-quality content. High-quality content is content that satisfies user intent.


➡️ You Have Unoptimized Title Tags


A title tag is an HTML element that specifies the title of a web page. Search engines display the title tag in the search results, and it is what the user clicks to reach the given result. As such, title tags are critical for social sharing, SEO, and general usability.
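For illustration, here is what a title tag looks like in a page's HTML head (the wording is a made-up example):

```html
<head>
  <!-- Shown as the clickable headline in the search results -->
  <title>Why Your Site Is Indexed But Not Ranking | Example Blog</title>
</head>
```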


➡️ The Importance Of The Title Tag


It is critical to have optimized title tags for your pages for the following reasons:


✅ They drive click-through rates, which means more people will visit your web pages. The more people click on a title, the more relevant it is seen to be, which is likely to help your pages rank for certain keywords.

✅ They make it easier for the search engines to understand the content of the page and rank it accordingly. By optimizing your titles, you make it easier for the search engines to determine your keywords.

➡️ You Do Not Have Unique Title Tags


You may have title tags, but if they are not unique they will not be helpful to the search engines or to users. Most of the time, if you have duplicate title tags, the search engine will ignore them, find what it believes is the most relevant content on the page, and use that as the title. Since the algorithm cannot be as accurate as you are in choosing keywords for the title tag, you may find that the page does not rank for the targeted keywords.


➡️ You are Blocking Crawlers


This is common with a new website or during a website redesign, when you may have used the meta robots tag or the robots.txt file. Both provide the search engines with instructions on how to treat the content on the website or page, especially regarding crawling and indexing.


The robots.txt file typically provides site-wide information, while the meta robots tag is usually used to provide instructions to the search engines on how to treat content on one page. However, you can also use the robots.txt file to tell the robots how to treat content in directories and pages during indexing.
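As a sketch, a robots.txt file combining site-wide and directory-level rules might look like this (the paths are placeholders):

```
User-agent: *
Disallow: /admin/
Disallow: /drafts/page.html

Sitemap: https://www.example.com/sitemap.xml
```

Here every crawler is told to skip the /admin/ directory and one specific page.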


A simple way to check how the website is using the robots.txt file is to append /robots.txt to the domain in the browser, for instance https://www.example.com/robots.txt.


➡️ Robots Meta Tag


The robots meta tag typically sits in the head of the webpage and tells the search engines what to do with the content of the page. The two most common directive pairs are index/noindex and follow/nofollow.


index, follow – Tells the search engines to index the content on the page and follow the links on the page.

noindex, nofollow – Tells the search engines not to index the content on the page and not to follow the links on the page.
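In HTML, these directives are placed in the page's head, for example:

```html
<!-- Index the page and follow its links (also the default behavior) -->
<meta name="robots" content="index, follow">

<!-- Keep the page out of the index and do not follow its links -->
<meta name="robots" content="noindex, nofollow">
```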


➡️ You Do Not Have Enough or High-Quality Links


Links are one of the most important components of Google's ranking algorithms, arguably as important as the content on your website or pages. To rank, you need high-quality links pointing to your website, and depending on the competitiveness of the keywords you are going for, you may need a lot of them. Backlinks are critical since they signal to the search engines that your website is reliable and trustworthy.


However, it is important to note that the quality of backlinks is often more important than the quantity of the links. A link from a highly authoritative and high-quality domain may be more valuable than 50 links from irrelevant and low-quality domains.

How to Get Your Site Ranking on Google

Every issue discussed above has a fix. Some require a simple fix, while others call for something more substantial that may take time and money to implement. You may also need to buy some tools, or use free ones, to fix the issues keeping your keywords from ranking.


The Google Search Console


This is one of the most important tools for finding and fixing issues with indexing and keyword rankings. With Search Console, you can determine whether Google has indexed your website and which terms it appears for. Once you have registered and set up your account, you can check for:


➡️ Crawl Errors – If there are issues with crawling, many of your pages will not be indexed and their keywords will not rank. Fix those errors to get your pages indexed and ranking.


➡️ Indexing – While not every page will be indexed, most of your pages should be. If most pages do not show, you need to find out why and fix it.


➡️ Redirect Errors – You can find redirect errors in Search Console and fix them. Pages may not rank for keywords because they redirect to other pages.


➡️ Robots.txt Tester Tool – Google provides a robots.txt tester tool that you can use to identify any issues with the robots file. You can also test individual pages to determine whether the robots file, as currently configured, is blocking the Google spider. If a directory or page on the website is being blocked, it will appear after a Disallow: directive in the robots file, which prevents the indexing of any pages found in the given directory.
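If you prefer to check rules locally, Python's standard library can parse a robots.txt file and answer the same question the tester tool does. This is a minimal sketch, with made-up rules and URLs:

```python
# Check robots.txt rules locally with Python's standard library.
from urllib import robotparser

# Example rules: block every crawler from the /private/ directory.
ROBOTS_TXT = """\
User-agent: *
Disallow: /private/
"""

rp = robotparser.RobotFileParser()
rp.parse(ROBOTS_TXT.splitlines())

# Anything under the disallowed directory is blocked from crawling.
print(rp.can_fetch("Googlebot", "https://www.example.com/private/page.html"))  # False
print(rp.can_fetch("Googlebot", "https://www.example.com/index.html"))         # True
```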


Note that you do not have to fix every error in Google Search Console. Focus on the obvious, large issues that are preventing the website from being indexed and your keywords from ranking.


➡️ Use Sitemaps - Implement XML sitemaps to deal with indexing and crawl issues. An XML sitemap is a listing of all the pages on your website, and it is critical if you have new content that does not yet have backlinks pointing to it. Without a sitemap, Googlebot may take much longer to discover some of your pages; with one, the Google spider can find and index your pages within hours or days of the content being updated.
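For reference, a minimal XML sitemap following the sitemaps.org protocol looks like this (URLs and dates are placeholders):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.example.com/</loc>
    <lastmod>2020-12-01</lastmod>
  </url>
  <url>
    <loc>https://www.example.com/blog/indexed-but-not-ranking</loc>
    <lastmod>2020-12-08</lastmod>
  </url>
</urlset>
```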


Most content management systems now come with the ability to generate an XML sitemap. If you are using WordPress, you can use the Yoast SEO plugin or any other free plugin to make your sitemap. Once you have made it, log into Search Console and submit the sitemap, which ensures that the spider knows where it is located and will find it easier to crawl your website and index new content.


➡️ Create More Content - The more content you have, the higher the likelihood that you will rank for more keywords. Create blog posts that answer questions about how your services or products work or can be used. The reason for this is twofold:


  • Your website will be crawled more frequently with more content
  • Your website will become associated with your industry through co-citation, which means you will rank for more keywords in that industry.



➡️ Optimize Your Title Tags - You need to optimize your title tags to improve your chances of ranking in the search engine results. You can do this by:


  • Limiting your title tag to no more than 60 characters
  • Including your brand name in the title
  • Using your target keywords as close to the start of the title as possible
  • Installing a tool such as Yoast SEO, which will help add unique title tags to each page
  • Creating unique title tags for each page
  • Writing title tags for people rather than just for Googlebot
  • Communicating benefits and value to improve click-through rates
  • Using best practices for writing effective headlines
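To make the checklist concrete, here is a small sketch in Python that audits a set of pages for over-long and duplicate title tags (the URLs and titles are invented):

```python
# A minimal title-tag audit: flags titles that are too long for the
# search results and titles duplicated across pages.
MAX_TITLE_LENGTH = 60  # characters typically shown in Google's results

pages = {
    "/": "Acme Plumbing | 24/7 Emergency Plumbers in Boston",
    "/water-heaters": "Water Heater Repair & Installation | Acme Plumbing",
    "/contact": "Water Heater Repair & Installation | Acme Plumbing",  # duplicate
}

def audit_titles(pages):
    problems = []
    seen = {}  # title -> first URL that used it
    for url, title in pages.items():
        if len(title) > MAX_TITLE_LENGTH:
            problems.append((url, "too long"))
        if title in seen:
            problems.append((url, f"duplicate of {seen[title]}"))
        else:
            seen[title] = url
    return problems

print(audit_titles(pages))  # [('/contact', 'duplicate of /water-heaters')]
```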
➡️ Fix Low-Quality Content


Good content means content that your users are looking for: typically in-depth content that answers the user's query. The best way to fix low-quality content is to research the competitors that are ranking for your keywords and then do better than them.


Be better at keyword optimization, cover different angles on the topic, and go more in-depth to stand a chance of ranking. Content that best answers user queries and keeps users engaged sends positive signals to Google, which will help your keywords rank.


➡️ Build High-Quality Backlinks - You can check the quality and quantity of your backlinks through tools such as Small SEO Tools, Ahrefs, Semrush, and many other free and paid tools. These tools show you the best links pointing to your website or pages, which you can compare against the competition's. You can then reverse engineer your competitors' backlinks and replicate them to rank higher.


Another way of building backlinks is to write high-quality, genuinely helpful content, which will naturally attract backlinks over time. You can also build backlinks through outreach to other websites, or by writing guest posts with links back to your site, to get your keywords and website ranking in the SERPs.


➡️ Target Long-Tail Keywords - If you have been targeting highly competitive keywords, shift your focus to long-tail keywords, which are typically less competitive. If you do not have the money, time, and experience to compete for competitive keywords, you can make progress with long-tail keywords, then shift to medium- and eventually high-competition keywords as you gain experience and resources. With more domain and page authority, you can target more competitive keywords and get them to rank.


As you can see, there are many reasons why your site could be indexed but not ranking. The items I've listed are among the most common that I've come across. If your site is indexed but not ranking, reach out to me. I'll be happy to help you solve this issue.
