Algorithm Changes – SEO Optimizers

How Google Search Works: Crawling, Indexing & Ranking Websites
How Google Ranks Websites

Last Update May 18, 2025 @ 2:36 am


The key to achieving success in the modern digital marketplace is search engine optimization or SEO. You want your website to rank above others when people search for relevant queries, which will lead to you outperforming your competitors and ultimately driving more clicks, conversions, and sales.  

Without good SEO, it’s difficult to succeed online. Luckily, there are many ways to help your website perform better, and numerous techniques you can use to take your SEO to the next level. Part of this involves understanding the mechanics of how search engines operate, and how they crawl, index, and rank websites.  

How Google Sees a Website

Spiders and Crawlers

One of the first concepts to grasp is how Google views websites. Google and other search engines don’t see the website the same way we do, with interactive modules, dynamic components, and stylized text. Rather, they just see the webpage as computer code in the form of HTML. HTML is the standardized computer language used to display websites on the internet.  

Google uses crawlers, sometimes called “spiders,” to parse through this code line by line and ascertain certain characteristics of the website, which it then uses to rank the site. These spiders can be understood as scripts or robots that automatically parse and analyze the HTML code of a website.  

View Page Source

To view your website similarly to how Google would view it, right-click on the page and select “View page source”. This will open a new tab (or window) with lines of code, which might appear technical and a little confusing if you aren’t familiar with HTML.

In this page source, there is only text. This computer language displays images, video, and other elements of a website as lines of code that the browser then turns into actionable items on the screen. 
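You can also fetch that same raw HTML programmatically, which is a close approximation of what a crawler actually receives. Here is a minimal sketch in Python, assuming the third-party requests library is installed and using a placeholder URL:

```python
import requests

# "https://example.com" is a placeholder; substitute one of your own pages.
response = requests.get("https://example.com", timeout=10)

# This text payload is essentially all a crawler has to work with.
print(response.status_code)
print(response.text[:500])  # first 500 characters of the HTML source
```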

To maximize your SEO, you want your HTML to be accurate, easy to understand, and kept as simple as possible. The more you can spoon-feed to Google by providing easily understood text, the better your page will rank compared to its peers.  

Google’s crawlers or spiders go through this source code and use keywords and the information available to ascertain what your website is about. That’s why it’s important to make sure your website clearly explains its purpose, category, and function in plain language within the content.  

While it’s not critical to know how to code in HTML, due to widely-accessible website creators, it’s still important to understand the basic functions and the fact that search engines only see websites as HTML.

For example, Google does not “see” images. Rather, it sees a line of HTML code that links to the image uploaded on a file hosting server. So, one way to optimize your HTML for Google’s SEO procedures is to always give your image files names that are relevant to what they actually show and to your website, as this filename is all Google will be able to see. You can also use “alt tags”, which are HTML attributes applied to images that further help search engines understand them and improve SEO.
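As a quick self-audit, you could scan a page for images that are missing alt text or that use generic filenames. Below is a rough sketch using the requests and BeautifulSoup packages; the URL and the “generic filename” pattern are illustrative assumptions, not a Google rule:

```python
import re
import requests
from bs4 import BeautifulSoup

URL = "https://example.com"  # placeholder: one of your own pages
# Heuristic for camera-style names like IMG_1234.jpg (an assumption, not a standard).
GENERIC_NAME = re.compile(r"^(img|image|dsc|photo)?[_-]?\d+\.(jpe?g|png|gif|webp)$", re.I)

soup = BeautifulSoup(requests.get(URL, timeout=10).text, "html.parser")

for img in soup.find_all("img"):
    src = img.get("src", "")
    filename = src.rsplit("/", 1)[-1]
    issues = []
    if not img.get("alt", "").strip():
        issues.append("missing or empty alt text")
    if GENERIC_NAME.match(filename):
        issues.append("generic filename")
    if issues:
        print(f"{filename or src}: {', '.join(issues)}")
```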

Text is Best: How to Optimize Your HTML for Google

Due to the difficulty search engines can have with interpreting various page elements like images or videos, simple text is the best way you can present data to them. You should aim for at least 400-700 words per page so that the crawlers have enough content to analyze and interpret. Any page that you want to show up in search engine rankings will need content on it; having more written content and keywords on each page gives you a better chance of ranking higher.

Because you want people who land on your page to see a nice, streamlined site with easy access to the links and tools they need, you can put this written content at the very bottom of the page. That way, it won’t clutter up the page or make it appear too crowded, but will still be accessible by Google. 

A good way to figure out how much content you should have on your web pages is the following process: 

  1. Go to your competitors’ sites, or the sites that rank at the top of the results for your relevant keywords.  
  2. Once you are there, copy and paste the text on their pages into a word processor, and check it to see how many words per page they use on average.  
  3. Shoot for slightly more than that so that Google will prefer yours over theirs due to its higher amount of text content.  

Don’t hold back when writing this content; more high-quality written material on your pages is always better. 
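If you want to automate the comparison described above, a short script can pull each competitor page and count its visible words. A rough sketch assuming the requests and beautifulsoup4 packages; the URLs are placeholders:

```python
import requests
from bs4 import BeautifulSoup

# Placeholder list: swap in the pages that rank at the top for your keyword.
competitor_urls = [
    "https://example.com/competitor-page-1",
    "https://example.com/competitor-page-2",
]

counts = []
for url in competitor_urls:
    soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")
    # Drop script/style blocks so only visible text is counted.
    for tag in soup(["script", "style", "noscript"]):
        tag.decompose()
    word_count = len(soup.get_text(separator=" ").split())
    counts.append(word_count)
    print(f"{url}: {word_count} words")

if counts:
    print(f"Average: {sum(counts) // len(counts)} words -- aim slightly higher than this")
```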

The Almighty Algorithm

Once Google has crawled your HTML code and received as much data as possible, it then puts what it has learned into its algorithm. This algorithm is highly secretive and proprietary, and no one except high-level engineers at Google knows the specifics of it.

Additionally, it can change at any moment, which is why SEO is an ongoing and continuous process. Google results change every day, meaning the algorithm ranks people differently every day as well.  

There are over 200 different ranking factors that go into the algorithm, making it a complex operation to understand. However, understanding it isn’t necessary; the only thing you have to do is beat your local competitors in relevant search results. Essentially, ask not how the Google algorithm works; ask how you can do more SEO than your competitors.  

Algorithm Updates and Changes

Every few months, Google will release a periodic update to their algorithm which can strongly affect search results. Some of these updates seek to penalize websites for not following SEO procedures or those that are doing it “wrong” in their eyes. That’s why maintaining the highest quality SEO procedures possible is always important.  

Google does not like to be “tricked” by deceptive techniques or websites. They constantly look for loopholes to patch so that people cannot manipulate their rankings in ways they find undesirable.  

If you are penalized by Google, the worst-case scenario is being kicked off their results entirely, but this doesn’t happen very often. More commonly, those found to be not following proper SEO procedures will be dropped in the rankings, causing them to lose traffic that can only be brought back by buying ads. Always make sure that your SEO is professional and above board in all regards.  

However, you won’t necessarily be told when you are being penalized. While you may sometimes be informed through Google Search Console or Google Analytics, more commonly, you will simply notice that your traffic has slowed. From there, you can check the date the slowdown started and see if Google released an update around that time and what might have affected you.  

Panda, Penguin, and Hummingbird

Panda, Penguin, and Hummingbird are names given to various updates to Google’s algorithm, which attempt to regulate or control which websites show up on their results. Their goal is to prioritize organic, original content with a high degree of expertise, authoritativeness, and trustworthiness while lowering results from irrelevant or low-quality websites.  

Panda

The Panda version of the algorithm was released in 2011 and has had numerous updates since then. Its goal is to demote websites with non-original, duplicated content or pages with “thin” content, meaning pages with very little text. The aim is to provide more points of view and options for searchers, rather than the same article or content in multiple places. 

Penguin

The Penguin update to the algorithm was released in 2012 and targets different areas of SEO. This update targets those who participate in what Google sees as “keyword stuffing” or “backlink schemes”, which are ways to try and trick Google into seeing your website as more highly-regarded or authoritative than it really is.

Hummingbird

The Hummingbird update began the process of optimizing Google for mobile devices and the way people search on their phones, including voice search. It changed the way Google works, from matching each individual word in a query to contextualizing and understanding the search query as a whole. This allows Google to process language and user intent more naturally by incorporating more variables into each search, such as geolocation, context, and other factors.

It’s important to be aware of what these algorithm updates look for and what they try to penalize, as getting out of Google’s penalties can be very difficult and isn’t guaranteed.

Q&A

Q: Why is a landing page so important?

A: Landing pages are critical to SEO and online success because they are the page a user will land on immediately after clicking your ads. This is your best chance at making a sales conversion, and people have short attention spans, so you need to make the most of every opportunity you have when someone lands on your page.

Always make sure your landing pages are up-to-date, attractive, function well, and have easy options for the potential customer to purchase your product, such as an easy way to make a sale, sign up for your service, or get in touch with you.

Q: Should you put dashes in between your keywords?

A: Try to use natural language whenever possible. If they normally have dashes, then you should write them out that way, but if it’s normally just spaces, then write them that way. For image filenames, dashes between words can help Google parse the information and understand it better, but this isn’t a requirement.  

Q: What should I do if my website is more image-based, like for an artist?

A: If you’re an artist or your website is primarily image-based, then you just need to add content and descriptive titles to your pages. Try to come up with content that describes your images, and consider adding a few paragraphs at the bottom of your page describing what it is. Also, use descriptive language in your filenames and make sure that you are utilizing alt tags.

Q: Will Google keep ranking pages higher when I add quality content?

A: Yes, Google will keep updating its analysis of your page and will rank your page higher when you continue to add quality content. Make sure not to use the same content for multiple pages on your website, as you will be penalized for this and also find yourself competing against your own website, leading to lower rankings for each page. 

Q: What if I have blocks of content and then a hotlink for the user to read more?

A: That should work, as long as there is enough content on the page for Google to read. Don’t hide the content, and always make sure that Google can understand the basic function and purpose of the page without needing to click the link. Consider consolidating the content if you want the first page to rank higher.

Q: How does Google know what quality content is?

A: Google uses Hummingbird and other complex algorithms to determine what it describes as the authoritativeness, expertise, and trustworthiness of the content. Original, non-duplicate content is part of this, but there are other metrics they look for, such as how many people link to your site, legibility, and how helpful it is.  

Q: Is it better to write one page with a lot of content or multiple pages with less content?

A: What you want to do is make one page with a lot of content and then replicate that for each service you offer. So, don’t split up content unnecessarily, but also don’t have multiple pages that could function as one instead. Each specific service, category, or product you offer can have 1,000+ words describing it, and always focus the most on your home page or top-selling product when it comes to content.   

Q: What is an example of a deceptive SEO technique that Google doesn’t like to see?

A: One example of this is “keyword stuffing”, where you can put a bunch of keywords on your site in tiny, almost-invisible text that is the same color as your website background so it blends in. While we may not usually notice it, Google will immediately recognize this as a deceptive SEO strategy and penalize you for it. 

Q: Does a 1,000-word page rank higher than a 1,000-word blog?

A: As far as Google is concerned, a blog is a page. There isn’t any real difference, so it won’t matter when it comes to SEO. Blogs can be a great way to increase the amount of content on your site once you have described your products and created a home page.

SEO is the most important factor in online success in the modern digital marketplace. Without it, your business will suffer and you will lose ground to your competitors. Contact an SEO professional today and watch your impressions, conversions, and sales soar!

How Do Search Engines Work?
Google’s Algorithm

Last Update May 16, 2022 @ 10:08 am

Search engines like Google are a ubiquitous part of the modern world, with Google actually added to the Merriam-Webster dictionary in 2006 as a verb. Billions of people around the world intuitively know what “to Google” a word means, and as a consequence, the way we interact with each other and the world around us has been forever changed.

Appearing on the top results of search engines can make or break a startup business or website, and conversely, it’s almost impossible to succeed as a new business these days without making sure that your website will show up in search results.

However, while many may think that the technology behind search engines is an obscure and arcane science, it’s a rather ingeniously simple process that can be understood and even manipulated by the average person.

The process of refining and manipulating your search results is known as search engine optimization or SEO, and it is a critical part of running a business in the modern world. Before you can ensure your website or company has optimal SEO, it’s important to understand the mechanics of how search engines work in the first place.

Entering the Information Age

Although many people take it for granted that they can type a search query into Google and get millions of relevant and interesting results, this technology didn’t always exist. What information you had access to used to depend on your geographical location and status in society, and not everyone could find and read the same books or documents.

All of this changed when the internet was introduced, putting the cumulative mass of humanity’s collective knowledge at the fingertips of the masses. In addition to the greater amount of information available, one of the most defining and exciting aspects of search engine technology is how it ranks information we see, giving us the results we were looking for seemingly instantly.

Page Ranking Techniques

At its heart, Google and other search engines can be understood as ranking systems that use a form of artificial intelligence and algorithms to determine what information to show you. While many people might think that Google ranks pages based on how many people have clicked the link or viewed the page, or simply searches web pages for keywords, the way it works is slightly different.

The way Google ranks results for relevance is based on how many links there are to a given page, which are known as backlinks. In addition, Google analyzes the popularity and authoritativeness of each backlink and assigns values to each one.

This PageRank algorithm was developed by Google founders Larry Page and Sergey Brin while they were studying at Stanford and was trademarked by their company in 1998. Only 8 years later, it had revolutionized the world so much that “to google” was part of the lexicon of the average global citizen and had earned its dictionary entry. Few people have changed the world in such a dramatic and useful way, and the PageRank algorithm continues to evolve and affect the way businesses, people, and institutions interact with the world around them.
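For intuition, the core of the original PageRank idea can be sketched in a few lines. This is a toy version of the published formulation, not Google’s production algorithm, and the link graph is made-up example data:

```python
# Toy link graph: page -> pages it links to (illustrative data only).
links = {
    "home": ["about", "blog"],
    "about": ["home"],
    "blog": ["home", "about"],
}

damping = 0.85
pages = list(links)
rank = {p: 1 / len(pages) for p in pages}

# Repeatedly redistribute rank along outbound links until it stabilizes.
for _ in range(50):
    new_rank = {p: (1 - damping) / len(pages) for p in pages}
    for page, outlinks in links.items():
        share = rank[page] / len(outlinks)
        for target in outlinks:
            new_rank[target] += damping * share
    rank = new_rank

print(rank)  # pages with more (and better-ranked) backlinks score higher
```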

What is SEO?

Surrounding this algorithm, an entire industry has sprung up, known as Search Engine Optimization. This can be understood as the science or art of manipulating search results to make sure certain results come up higher on the ranking than they would have naturally.

SEO is something that all companies do, whether intentionally or not, and it can be understood as simply a more technologically advanced form of marketing for the modern world.

To put it simply, search engine algorithms base their results on backlinks, that is, the number and quality of links that point to a given page. Therefore, the best way to improve your SEO is to find ways to acquire more links to your page.

This is often easier said than done, but with effort, assistance, and creativity, backlink networks can be built that will help a page rise in the results. Some of the methods used to build links to a page are:

  • Reaching out to popular blogs or news sites to be featured
  • Running a popular and successful social media page that people interact with frequently
  • Making sure that your website utilizes strong and consistent keywords
  • Contacting an SEO professional for assistance

The goal of SEO is to drive organic traffic, which is mainly achieved by presenting interesting, authoritative, and compelling material. If your content is engaging and truthful, more people will be likely to link to it or discuss it, which will attract more attention from search engines.

The best way to achieve SEO is to present authentic and creative material that naturally and organically attracts and appeals to consumers.

When all this is taken into consideration, search engine technology becomes less of a mystery. These results have a massive impact on the way we interact with the world, but they can be understood and manipulated by the average person, which is a major component of success in the globalized, digital economy.

13 Things That Google Ignores About Your Website
How Google Algorithm Works

Last Update Aug 24, 2022 @ 6:20 pm

SEO has continued to change over the past 15 years, so much so that 60% of SEO professionals (from a Twitter poll) find it much harder to do now.

Some things that used to work no longer work. Other things have been deprecated and replaced with new things (like keyword density, replaced by natural language processing).

There are plenty of things that Google no longer uses, too. Things that used to be helpful up to a few years ago now go entirely ignored.

In this post, we’re going to tell you about 13 things that Google ignores about your website and what you can do with these things instead, so they won’t go completely to waste.

13 Things That Google Ignores About Your Website

1. Low Quality Pingbacks

Google ignores low quality pingbacks to your blog.

That means both good and bad news: the good news is that a bunch of spammy blog pingbacks won’t hurt your site in Google and won’t lead to a manual action. The bad news is that Google will see even good quality pingbacks as worthless. In fact, Gary Illyes said on Reddit that “it’s very very likely those pingbacks are marked worthless (meaning they’re ignored) on Google’s end”.

What to do with pingbacks

While Google ignores these links and therefore they have no SEO value, they can still have traffic value. So we believe the best course of action is to monitor the few good ones you get for referral traffic.

2. Forum Profile Links

Google ignores forum profile links.

That means trying to acquire tons of profile links won’t get you any SEO benefits and it might only make you appear spammy.

What to do with forum profile links

When you register on a forum of interest to you, the website field in your profile is a great opportunity to drive traffic to your website. But how do you leverage that? The best way to drive people to your profile and then to your website is to be an active member of the forum: engage in discussions and give great advice whenever a forum poster asks for it. Your good reputation will be the number one traffic factor.

3. Links from UGC (User-Generated Content)

Google ignores links from UGC.

That means any links you place in comments and forum replies will not have an SEO benefit.

What to do with UGC

Comment links are still helpful for their referral traffic value: your website link in the Website URL field, any links in the body of a comment that the blog owner approves, and even the links you place in Quora and forum replies. These are all links that can bring interested visitors to your website, so while you can’t go after them for SEO value, you can definitely leverage them for traffic value.

4. Guest Post Links

Google’s John Mueller said on Twitter that Google now ignores guest post links, and that they “catch most of these algorithmically“.


What to do with guest posts

The fact that Google now ignores links from guest posts doesn’t mean that engaging in guest posting is a bad thing. Actually, guest posting is a powerful tool for growing brand visibility, thought leadership, and inbound (relevant) traffic.

It’s time to run glorious guest posting campaigns — just not for SEO.

5. Meta Description in Search

According to a 2020 study from Ahrefs, Google often ignores your meta description in search, returning other portions of text that it deems more relevant instead.

What to do with meta description

Although it’s shown only 37% of the time in search (still according to Ahrefs’s study), writing your own meta description is not a waste of time because it can encourage clicks whenever it is shown. Your meta description is an exercise in good copywriting.

6. Capitalization in HTML Tags

John Mueller said on Twitter that Google ignores capitalization in HTML attribute names.


What to do with capitalization in HTML tags

Nothing to do here. Just try to be consistent with your capitalization, as Mueller suggests.

7. Keywords Meta Tag

Google doesn’t use the keywords meta tag when ranking content.

The reason is that this meta tag was abused decades ago, with website owners and SEOs stuffing irrelevant keywords in an attempt to manipulate web search.

What to do with Keywords meta tag

Unless you use tools and search engines that still use this meta tag to extract information about your website, you can completely disregard it and leave it empty in your website HTML code.

8. Links from Spammy Domains

John Mueller said on Twitter that Google ignores unnatural links from spammy domains.


That means that you shouldn’t worry about any blatantly bad links your website might get.

What to do about links from spammy domains

Unless they’re causing you harm (e.g. getting you a bad reputation with influencers or sending you spam traffic), in which case you may want to ask for link removal, you can ignore these links altogether and instead focus on getting good links.

9. Nofollow Links

Since 2019, Google has treated nofollow links as hints (Bing does, too), meaning that it might use these links for ranking purposes or even just for finding spam. It’s up to Google.

What to do with nofollow links

Getting nofollow links won’t hurt rankings, but they may not help them either. They’re still good links for getting traffic anyway.

On your website, use nofollow at your discretion, especially when you don’t want to treat your links as a vote of trust.

10. Affiliate Links

Google treats affiliate links as nofollow links even without the rel=nofollow or rel=sponsored attribute.

Google ignores affiliate links

Source: Twitter

What to do with affiliate links

Use them normally as you would any other link, but we recommend that you use a rel=nofollow on them to help Google better identify them as affiliate links. There is no real reason to make these links pass PageRank anyway.
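As an illustration of that recommendation, here is a minimal sketch that adds rel attributes to outbound affiliate links with BeautifulSoup; the HTML snippet and the affiliate domain are made-up examples:

```python
from bs4 import BeautifulSoup

html = """
<p>Check out <a href="https://affiliate-network.example/product/123">this product</a>
and our <a href="/pricing">pricing page</a>.</p>
"""

soup = BeautifulSoup(html, "html.parser")

for link in soup.find_all("a", href=True):
    # "affiliate-network.example" stands in for whatever affiliate network you use.
    if "affiliate-network.example" in link["href"]:
        link["rel"] = ["sponsored", "nofollow"]

print(soup)  # affiliate links now carry rel="sponsored nofollow"
```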

11. Length of Title Tags

Well, Google doesn’t exactly ignore them: long titles get truncated in search results, but Google still reads your title beyond what’s displayed, and you can still benefit from any keywords there SEO-wise.

What to do with title tag length

Keeping your title to around 60 characters is still a best practice, but it’s not always applicable (e.g. the title of the post you’re reading), so don’t worry about it too much and make sure your title attracts clicks instead.
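If you want a quick way to see which of your titles are likely to be truncated, a short script can check their lengths. A rough sketch; the 60-character cutoff follows the best practice above and the URLs are placeholders:

```python
import requests
from bs4 import BeautifulSoup

DISPLAY_LIMIT = 60  # rough cutoff based on the best practice above

for url in ["https://example.com/", "https://example.com/blog/"]:  # placeholder URLs
    soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")
    title = soup.title.get_text(strip=True) if soup.title else ""
    status = "may be truncated" if len(title) > DISPLAY_LIMIT else "ok"
    print(f"{len(title):3d} chars  {status}: {title}")
```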

12. Pagination

With rel=prev / rel=next now deprecated, Google is leaving it up to webmasters to handle pagination, deeming it more a UX issue than an indexing one.

Google ignores pagination format

Source: Twitter

What to do with pagination

Keep handling it the way you’ve always been handling it. If you make changes, make them based on UX not SEO.

13. The Position of rel= Attributes

Google ignores the position of rel= attributes within a link’s HTML markup.

So whether you place, for instance, rel=nofollow before or after the href= attribute in an <a> tag, it does not matter to Google.

What to do with rel= attributes

It’s up to you. Just keep some consistency with the location of these attributes.

BUT Google Does NOT Ignore Sitewide Links

Neil Patel’s study shows that, albeit less effective than in-content links, sidebar and footer links still help rankings.

That means that obtaining sitewide links is still effective and you don’t need to remove these links because they won’t hurt you (unless they’re clearly spammy).

Conclusions: Don’t Spend Too Much Time and Money on Things Google Ignores

The takeaway from this post is that spending too much time and money on things that Google ignores is not worth it, unless you get a UX or traffic benefit from them.

Some things like guest posts can still be used as a tool for brand visibility even though they might no longer carry value as a link building tactic.

The rule of thumb is that, if your main goal is to optimize for Google, focusing on what Google incentivizes in the SERPs is a much better idea.

Have you been focusing on things that Google ignores?

We hope that this post was helpful to you! Let us know in the comments if you found anything else that Google ignores about your website.

Here’s to your success!

Passage Indexing: What It Is & How It Impacts SEO
Google Passage Indexing

Last Update Aug 24, 2022 @ 6:20 pm

An update that Google launched in February 2021, passage indexing came with the purpose of surfacing content in search that is buried in pages and wasn’t optimized for search engines.

The update has been largely discussed in forums and SEO publications like Search Engine Land, and it’s supposed to impact 7% of the search queries. It goes without saying that the SEO and webmaster community expect its effects to be massive.

Passage Indexing: Tweet by Google

Source: Twitter

The idea behind passage indexing is to help webmasters rank their pages better, so it’s definitely a good kind of update from Google.

Let’s learn more about it.

What Is Passage Indexing?

Passage indexing — or more correctly, passage ranking, because it’s about ranking, not indexing — is an internal ranking update that uses natural language processing to rank passages of a page that answer a user’s ultra specific query.

Google says:

“Very specific searches can be the hardest to get right, since sometimes the single sentence that answers your question might be buried deep in a web page. We’ve recently made a breakthrough in ranking and are now able to better understand the relevancy of specific passages. By understanding passages in addition to the relevancy of the overall page, we can find that needle-in-a-haystack information you’re looking for.”

This happens every time a passage in a page “is a lot more relevant to a specific query than a broader page on that topic”.

As an example, for the query “where do I find good shoes for knock knees” Google might return the subsection of a “best of” article that talks about shoes for knock knees based on the content of that subsection, not only its heading, and even if the rest of the page talks about different kinds of shoe needs.

Google also provides a visual example of how passage ranked results should appear in search:

Passage Indexing: An example of how it should appear in search, from Google

Although the purpose of the update appears straightforward, there are a lot of misconceptions about passage indexing/ranking around the Web.

See below for clarification.

What Passage Indexing Is NOT

Passage indexing is not Google indexing sections of a page, or passages.

As always, Google indexes whole pages. What changes is how Google’s algorithms look at the content of the page (its passages) to return relevant text that answers a searcher’s query.

Also, passage indexing is not a Featured Snippet, which is a portion of text that Google displays in the SERPs to answer a searcher’s question in a quick way, without having the user click on the result.

With passage indexing, you get regular results but based on a passage or a subsection rather than the full page with its main heading. In other words, passage indexing is about ranking, not display.

Passage Indexing: Tweet by Danny Sullivan at Google

Source: Twitter

Finally, there are no penalties linked to this update: it’s simply a ranking change whose purpose is to help users discover content that’s hard to find due to poor optimization.

Suggested SEO Improvements

Passage indexing exists to bring useful content hidden in a page to light, so ideally there’s nothing to optimize for, but you can make Google’s work easier — and up your chances to rank better, especially for your less optimized pages — by using a few best practices.

Content Planning

Content planning should help your SEO, not hinder it.

Don’t create a lot of pages on similar topics. Rather, prefer long form content with comprehensive subsections, where you say all you have to say about a subtopic.

In fact, unless what you have to say requires a lot of space, expert insight, and illustrations/graphs/screenshots, you can compress it into a subsection of a long-form article (think a 3,000-word guide, for example).

Content planning will help you decide what topic to write about in a comprehensive way and what other related topics (again, not similar topics) you can cover so that users can find all the information they need on your site, by following links in the article they’re reading.

For example, you can create a long form guide on “content marketing strategies” and cover each strategy in subsections, then create case studies and expert interviews about those strategies, and cross link the guide and the cases/interviews.

Search Intent

Think about the search intent of the topic you’re covering and even the search intent of each subsection: how can you optimize your content so that each subsection, and not only the article as a whole, stands on its own?

That will make it easier for Google’s algorithms to rank your subsections for specific search queries.

As an example, the “Goodreads groups” subsection of an article on your book blog about getting free ARCs of books may rank for “free arcs goodreads” and users would be sent directly to that subsection of your article, skipping the rest:

Passage Indexing: example of subsection ranking in Google

Headings and Subheadings

You may want to pay special attention to headings and subheadings in your page and make sure they are meaningful and descriptive of the content of the section or subsection they introduce.

This is especially critical if you want to rank for subsections and not only the full page. See the previous point on search intent for an example.

Table of Contents

Have a table of contents in each post or page, so that search engines can better rank your subsections (and so users can jump to the subsection they’re most interested in).

If you use WordPress, there are plenty of good free table of contents plugins you can use.

If you have a static HTML website, you can follow this tutorial by Tips and Tricks HQ to create a simple table of contents.
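On a static site you can also generate a simple table of contents from the page’s own headings. A rough sketch with BeautifulSoup, assuming your subsections use plain <h2>/<h3> tags and using a placeholder filename:

```python
from bs4 import BeautifulSoup

html = open("article.html", encoding="utf-8").read()  # placeholder filename
soup = BeautifulSoup(html, "html.parser")

toc_items = []
for heading in soup.find_all(["h2", "h3"]):
    text = heading.get_text(strip=True)
    anchor = text.lower().replace(" ", "-")   # crude slug; refine as needed
    heading["id"] = anchor                    # give each heading a jump target
    toc_items.append(f'<li><a href="#{anchor}">{text}</a></li>')

toc_html = '<ul class="toc">\n' + "\n".join(toc_items) + "\n</ul>"
print(toc_html)  # paste or template this block near the top of the page
print(soup)      # the page itself, now with id attributes on its headings
```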

Focus On Users

With this ranking change, Google is highlighting the importance of content for users rather than for crawlers, so make sure you prioritize your content over SEO practices to be able to answer searchers’ queries thoroughly.

Naturally, that doesn’t mean you should neglect your SEO. Well-optimized content is still content that’s easier to find on all major search engines, and that’s the result you want to achieve: you want to be found easily.

But your focus shouldn’t be on SEO: it should be on bringing value to users so that they see you as a trustworthy authority in your niche.

Conclusions

With the passage indexing update, Google is helping webmasters with poorly optimized but quality pages rank better. That is a great thing and a very welcome update in the webmaster community, because it makes it easier to rank good content for specific keywords even if a webmaster’s SEO skills are poor.

But that is no excuse to not learn and do good SEO for your pages: the more you know about SEO, the better you can rank for more than just passages or subsections.

How did you welcome the news of passage indexing? Are you going to work on your SEO to better rank your pages?

Share in the comments below.

BERT SEO: How To Optimize Your Content For Google
BERT SEO Strategies

Last Update Aug 24, 2022 @ 6:20 pm

If you’ve been keeping up with the world of SEO and Google, you may have heard about this “BERT” update Google rolled out in late 2019.

Because Google is constantly updating its algorithms, this shouldn’t come as a surprise, but BERT is something new compared to previous updates, as it relates to a machine’s understanding of human language.

But let’s see the details of it and why it matters to SEO.

What Is BERT?

BERT is an algorithm update that Google introduced in October 2019. The name is an acronym that stands for Bidirectional Encoder Representations from Transformers and Google uses it to better understand human language and the contextual use of terms in content.

That means that the BERT update affects both web search and voice search, as understanding context and the intent of both words in content and voice queries is at the core of it.

Note that when we talk about context here, we refer to the text surrounding target words.

For a more technical overview of BERT, watch DeepCrawl’s 2019 webinar on Understanding Natural Language Processing and BERT for Search.

Can You Optimize for BERT? (Yes, in 9 Ways)

In other words, given the nature of this update, does it make sense to speak of ‘BERT SEO’?

The short answer is YES — there are areas where you can make improvements for BERT to better understand your content.

We’ll explore these areas in 9 points, some of which come from the expertise of a number of SEO professionals we interviewed through HelpAReporter.com.

1. Good Grammar and Sentence Construction

Because BERT helps Google to understand the meaning of words in queries (intent) and content based on context, it’s critical to be as precise as possible in sentence construction in your content and to review your grammar to perfection.

It may help to have someone else review your content, as it’s hard for the one who writes to catch every typo and awkward sentence construction.

2. A New Way to Look at Keywords

With the introduction of BERT, keywords stay relevant but they’re no longer to be taken as exact-match terms to be found in the text. Rather, they can and should be seen as topics that your text is built around.

This doesn’t change things much in practice: you will still use keywords in your content, but you no longer have to worry about reaching a certain density or using the keywords “as is”.

Something to pay attention to is the context around your keywords. Alex Kehoe, Co-Founder & Operations Director at Caveni Digital Solutions, says:

“Position keywords with high context sentences that are historically associated with your desired keyword. For example, if you wanted to target, “Jogging in Minnesota” you may also include sentences about miles, kilometres, or running shoes. All of these phrases and discussions about them will increase the contextual relevance of your, “Jogging in Minnesota” keyword phrase. Topic and associative relevance is the best way to use BERT to your advantage.”

Using words specific to your topic at a certain frequency is another BERT SEO best practice, as Karolina Gawron, Head of Marketing at Surfer, explains:

“To help Google understand your answer is the best in SERP, you have to use specific words in a specific frequency. Smart usage of entities (nouns like people and places) from Google NLP (Natural Language Processing) API is the best guarantee your content is the most relevant to the main keyword.

For example, if you write for the “tasty breakfast”, phrases like “morning”, “coffee”, “fried eggs” will be highly relevant to the query. Words like “evening”, “dinner”, “potato chips” will not be relevant to the topic.”

How to figure out which phrases are relevant to your topic? Gawron says you can “analyze top-performing pages in SERP with Google demo API or use on-page optimization tools to get the phrases picked by their algorithm with the right density.”

List of terms related to the main topic and their density for BERT SEO

Image kindly contributed by Karolina Gawron
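One way to approximate this “specific words in a specific frequency” idea without a commercial tool is to count how often candidate terms appear on pages that already rank. A rough sketch, not the Google NLP API or Surfer’s actual algorithm; the URL and term list are placeholders borrowed from the example above:

```python
from collections import Counter

import requests
from bs4 import BeautifulSoup

top_ranking_urls = ["https://example.com/tasty-breakfast-guide"]  # placeholder
candidate_terms = ["morning", "coffee", "fried eggs", "evening", "potato chips"]

term_counts = Counter()
total_words = 0

for url in top_ranking_urls:
    soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")
    text = soup.get_text(separator=" ").lower()
    total_words += len(text.split())
    for term in candidate_terms:
        term_counts[term] += text.count(term.lower())

for term, count in term_counts.most_common():
    density = 100 * count / max(total_words, 1)
    print(f"{term:15s} {count:4d} occurrences  ({density:.2f}% of all words)")
```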

“People Also Ask” and “Related Keywords” also play an essential role:

Girdharee Saran, Head of Digital Marketing at MyAdvo Techserve Private Limited, says:

“[Use the] “People also ask” section because these are all long sentence queries and coming directly from people, you can get better results by answering these queries directly in your content. If you include a well-thought answer to these queries and also add FAQ schema, it will add to the understanding of the search engine. Not sure if this can be categorized as a BERT update but using this strategy, we are able to gain more visibility to our content. Also for a few questions, Google picked up our newly written answers.”

People Also Ask example for BERT SEO

“Related Keywords” can also help make a difference in your rankings. As Stacy Caprio, Marketing for AcneScar.org, says:

“These keywords are generated using Google’s semantic algorithms as related to your target keyword and you should include them in your content so Google’s BERT algorithm recognizes your content as inclusive and complete.”

3. Focus On Users Rather Than On Search Engines

With BERT, you can focus on answering your audience’s problems more than having a good number of keyword repetitions in your content.

That results in a switch of focus from optimizing for the search engine to optimizing for users.

It’s nothing new at its core, as websites are built for users and not for search engines, but having to rely less on keywords actually frees up your resources to give more to users and spend less on On-Page SEO.

4. Use a Table of Contents

John Matyasovsky, Digital Marketing Specialist at Roofing SEO Webmasters, says:

“Including a linkable TOC helps define the entities (i.e.: topics instead of exact match queries). Since each topic has sub-topics, the organized TOC helps define the context of the language. It also exists for user navigation, which indicates a user-focus.”

There are a number of TOC plugins you can use if your website runs on WordPress, a handy and easy-to-install example being LuckyWP Table of Contents.

5. Use Topic Clusters

International SEO consultant, speaker, blogger Milosz Krasiński explains in detail:

“A Topic Cluster refers to multiple pieces of content which are lumped together by way of a shared topic or sub-topic. These pieces of content will, together, form a comprehensive overview of that particular topic and will provide more than enough to satisfy searches.”

An example of this is a long-form content piece containing several sub-topics of the same topic, in the effort to provide a comprehensive guide for the user to read and put to use.

Krasiński explains how Topic Clusters can help you optimize for BERT:

“It’s a known fact that being visible for an actual topic is much more effective than being visible for just a keyword or phrase. As BERT is all about the context, therefore, it makes perfect sense that by using these clusters, you can create signals for search engines and optimise your SEO – even when it comes to the mega-fussy BERT.”

6. Make Content More Organic and Conversational

The way you present content to your users can make or break your content’s performance in the SERPs.

Niles Koenigsberg, Digital Marketing Specialist at FiG Advertising + Marketing, explains:

“As voice search has steadily risen over the past decade, it has become clear that this trend will be sticking around. BERT is just one of the ways that Google is working to capitalize upon voice search by honing in on the complex details of conversational language. For that reason, it would be wise to slightly adjust the content on your site to sound more natural and organic.

It would also be wise of you to rework your content and make it more simple and succinct. The BERT update is designed to help Google focus on the context of the words used inside of the sentences in your content. That means you shouldn’t beat around the bush with your blogs and site content, and instead focus on being straightforward, direct, and insightful. Try to avoid using unnecessary words that just fluff up your content and increases the reading difficulty. Word count is still important for ranking metrics, but it’s not as important as providing new and useful information.”

In other words, get to the point with your content, using as many relevant words as the topic necessitates.

7. Have an FAQ section on your website

This optimization was proposed by Tarun Gurang, Sr. Digital Marketer at iFour Technolab Pvt. Ltd., who says an FAQ section on your website is “helpful for the users and generate traffic to the website. Also, the FAQ section consists of all three sections grammar, sentence construction, and the keywords as topics with the form of LSI.”

“And the major benefit of using the FAQ is to provide appealing content for your targeted audience which they want or are looking for. I am sure by doing this, you will see a significant improvement in terms of traffic and keywords ranking for any website.”

8. Pay Attention to User Intent

User intent is “what a user is looking for when they conduct a search query”, says Riddhi Khatri, SEO Executive at Elsner Technologies Pvt. Ltd.

“Google’s BERT is itself an AI framework. It will learn every new information that it gets. The main purpose of BERT is to help google to understand user search intent and the other way to optimize for BERT is optimizing for user intent.

To optimize your website for user intent, identify the user queries that they are searching for, understand those queries and then research your targeted keywords.

Focus on the user before generating content that they want to see or read when they search for their queries. When you write content ask yourself, “Can my readers find what they are looking for in my content?””

9. Contextual Backlinks

Backlinks may be another area where you can optimize for BERT, according to Subro Chakraborty, Co-founder at InfluRocket:

“Topical backlinks are essential from a BERT perspective. If your page has anchor texts that are irrelevant to your topic or the backlinking page doesn’t have contextual content, that backlink might even hurt you as google will consider it spam.”

Conclusion on BERT SEO

According to Google there isn’t much to optimize for BERT, but as we saw in this article, there are actually a few things you can optimize to make your content perform better in Google.

The key takeaway is that BERT SEO is all about On-Page SEO, specifically about the quality of your content and how much effort you put forward to make sure it’s easy to read, conversational, grammatically correct and on-point with the topic you’re tackling.

How has your content been performing since BERT?

7 Algorithm Updates Currently Affecting Your SEO
Google Algorithm Changes

Last Update May 16, 2021 @ 12:25 pm

“I don’t know, let me Google that.” is more than a common phrase in our vocabulary; it’s become a cornerstone of society. In America, for example, roughly ¾ of people have access to the internet, and globally 63,000 Google searches happen every second.

With sites dedicated to every topic imaginable, how does Google know which ones to display after your search query? The answer is their revolutionary search algorithm. It’s constantly being updated; in 2018 alone, an average of about 9 updates were released each day.

This is a testament to Google’s goal of providing relevant content to its users. This article is designed to explain the 7 updates that are most likely affecting your website, what causes a website to be adversely affected by each update, and how you can remedy the situation.

In this article, we will cover the following updates to the search algorithm:

  • Google Panda
  • Google Penguin
  • Google Hummingbird
  • Google Pigeon
  • Google Mobile Friendly
  • Google Fred
  • Google Top Heavy

We will also recommend several tools to help you optimize your content for the internet, talk about the benefits of each product, and explain why you should use each one before publishing an article.

Google Panda Algorithm Update (Most Likely to Affect Your Site)

This portion of the Google Algorithm looks at the quality of the content on a website. The more unique the content is, the more likely the site is going to be pushed up to the top of the search query. 

“Bland” content uses the same terms over and over again, like this sentence, because “over and over” could have been replaced with “repeatedly,” and it would have worked.

What Causes a Site to Be Negatively Affected By Google Panda?

  • Low-Quality Content- If your site has poorly written content full of grammar mistakes, irrelevant images and content, or a lack of keyword richness, it will hurt your article’s ranking on Google in the long run.
  • Plagiarism- Google Panda loves original content. You can use images, upload videos, and add short sections of text as long as they’re in quotation marks. However, blatantly copying another website will adversely affect your Google Ranking.

How to Improve Your Rating if it’s Negatively Affected By Google Panda

  • Write original content
  • Proofread the page
  • Use Grammarly to check for grammar mistakes
  • Use Copyscape to check for plagiarism

How to Track Updates for Google Panda

Check your site at least once a month with Website Auditor. Use the SEO PowerSuite function to check for plagiarism, websites with content that is similar to yours, and excessive use of keywords or phrases people are searching for.

Google Penguin Algorithm Update

This program is similar to Google Panda, but rather than looking at the text itself, its focus is on the anchor texts and links being used. You want to use a variety of anchors and links, or it will appear as though your page is trying to funnel traffic to a particular site, or that you’re using the anchor as a way to attract people to your page.

Using links from trusted sites or scholarly sources can positively impact your SEO rating. In addition, links from reputable sources across different domains like .com, .edu, and .org can also boost your overall ranking.

What Causes a Site to Be Negatively Affected By Google Penguin?

  • Being Repetitive- As previously mentioned, whether you are repeating an anchor or the website you’re linking to, both affect your ranking. This is because the Google Algorithm sees this as a way to artificially inflate your Google search ranking. 
  • Linking to Low-Quality Sites- Linking to sites that Google Panda ranks poorly, pages that are irrelevant to the content of your page, or unsecured sites can adversely affect your SEO ranking.
    • As a quick example, if you have a food blog, linking to a page about the FBI’s most wanted would adversely affect your page’s rating.
  • Keyword Stuffing- Although you would think that this would be what Google Panda would be doing, Penguin does it on a larger scale. This is because it looks for keyword richness in relation to your anchor texts. If you repeat the same anchor, then you’ll have a lower ranking. Strive for keyword richness, not repetition, when it comes to anchor texts; a quick way to audit your own anchor texts is sketched after this list.
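Here is a minimal sketch of that anchor-text audit. The anchor list and the 30% threshold are made-up illustrations, not a Google rule; in practice you would export anchors from whatever backlink tool you use:

```python
from collections import Counter

# Placeholder data: load your exported anchor texts here.
anchor_texts = [
    "best running shoes", "best running shoes", "click here",
    "best running shoes", "acme sports", "running shoe guide",
]

counts = Counter(a.lower().strip() for a in anchor_texts)
total = sum(counts.values())

for anchor, count in counts.most_common():
    share = 100 * count / total
    # The 30% threshold is an arbitrary illustration, not a Google rule.
    flag = "  <- heavily repeated, consider diversifying" if share > 30 else ""
    print(f"{share:5.1f}%  {anchor}{flag}")
```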

How to Track Updates for Google Penguin

Check your growth and your backlinks using a tool like SEO SpyGlass. It’s an intuitive program, and you can look at your site’s overall visits and ranking. If you see any spikes, it may be indicative of a problem with either a site you are linking to or a poorly written site linking to your page.

Google Hummingbird Algorithm Update

This update is our favorite because it’s a great use of AI. What Hummingbird does is it takes what you searched for, and rather than search for what you typed, it tries to guess the motivation for your search, and directs you to sites that match your intended meaning. 

This update works by keeping track of similar searches; the sites that are clicked on the most, and that keep people on them the longest, will appear at the top of your search. This gives you a better chance of finding what you’re looking for.

What Causes a Site to Be Negatively Affected By Google Hummingbird?

  • Low-Quality Content- If your content is vague, full of grammatical errors, or anything else that would be an issue with Google Panda or Google Penguin, then Hummingbird will not rank that page highly.
  • Not Using Keywords Consistently- When writing an article, think about how people will find your page, and incorporate those searched words and phrases into your article. Have these words in not only the body but also the section headers and, if possible, the title.

How to Track Updates for Google Hummingbird

To get the best results with this update, use keywords and phrases, and think about how people will stumble upon your website. Be sure to include a variety of keywords and add as many similar terms and phrases as you naturally can in your article.

Playing with the syntax and language associated with your site will help it serve as a catch-all. Because the site will be keyword-rich, Hummingbird will be more likely to boost you up the rankings; a variety of keywords and well-written content are positively correlated.

Another tool you can take advantage of in Website Auditor is the Term Frequency–Inverse Document Frequency (TF-IDF). This will allow you to find synonyms that are commonly searched by people when looking up content relevant to your page. 

It also can display usage rates of keywords found on other pages. Both of these features make TF-IDF useful for evaluating the importance of words and phrases associated with your site.
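TF-IDF itself is straightforward to compute outside of any particular tool. A minimal sketch with scikit-learn on made-up documents; Website Auditor’s exact implementation may differ:

```python
from sklearn.feature_extraction.text import TfidfVectorizer

# Placeholder corpus: in practice, use the text of the top-ranking pages
# for your keyword plus your own page.
documents = [
    "shoes for knock knees and flat feet",
    "best running shoes for knock knees reviewed",
    "our guide to choosing running shoes",
]

vectorizer = TfidfVectorizer(ngram_range=(1, 2), stop_words="english")
matrix = vectorizer.fit_transform(documents)

# Show the highest-weighted terms for the first document.
terms = vectorizer.get_feature_names_out()
weights = matrix[0].toarray()[0]
for term, weight in sorted(zip(terms, weights), key=lambda t: t[1], reverse=True)[:5]:
    print(f"{term:20s} {weight:.3f}")
```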

Google Pigeon Algorithm Update

This update is different from the others on the list because it’s focused on geographic location as opposed to content. Another thing that makes Pigeon different from other updates is that there isn’t necessarily a penalty associated with this update, because it would be hard to change the location of a business.

Pigeon displays businesses on a map of your area; the closest business is point A, followed by points B and C. This allows you to see the general area of a business before heading to it.

What Causes a Site to Be Negatively Affected By Google Pigeon?

  • Distance- Pigeon displays the 3 closest businesses that provide what you’re looking for. The farther the store, the lower it will be on the ranking.

How to Track Updates for Google Pigeon

Since Google Pigeon works through geotags, and there aren’t penalties, rather than focus on updates, focus on how you can incorporate Google Pigeon’s parameters into your text. On your site, you can mention your physical address multiple times, and that may help boost your rankings below the map portion of the search results.

If you use the Website Auditor tool we've been mentioning throughout this article, there's a roundabout way to boost your SEO in relation to location. Go to Link Assistant, choose Look for Prospects, select Directories, and enter a keyword and location. By doing this at least once a month, you can make sure you're listed in all the directories relevant to your business.

For example, if you search for OB-GYN and San Diego, the tool will show you a variety of directory pages matching those parameters. It also lists a contact email address for each site, so you can ask the host to include your website. This spreads links across different parts of the internet and helps boost traffic.

Google Mobile-Friendly Algorithm Update

When this update dropped, it briefly shook up SEO and rankings in a way that had never been seen before. Rather than focusing on content, links, or location, it placed more importance on how user-friendly and aesthetically pleasing a website is on mobile devices.

A quick example of this update in action is if you Google “Santa’s reindeer names” on a PC and a phone, the order is slightly different, based on mobile optimization. Wikipedia, Holidappy, and Sno.co.uk host the top 3 links for PC, but on the mobile search Sno.co.uk is bumped to second, and Holidappy rounds out the top 3.

What Causes a Site to Be Negatively Affected By Google Mobile-Friendly?

A variety of factors feed into this update, but we're going to limit our list to five:

  • The Design- When creating a website, build it so that it can detect whether it's being accessed from a mobile device or a PC and serve a version optimized for that device. This improves usability on mobile, because having to pinch and zoom constantly discourages users from staying on your site.
  • Device-Friendly Content- Be mindful of what's on your page before adding it to the mobile version, because some technologies aren't compatible with all devices. Adobe Flash, for example, doesn't work on many phone models, and that inaccessibility can hurt your ranking.
  • Popup Advertisements- These annoy users, and depending on the popup they can also expose viewers to NSFW content or malware. Instead of popups, consider placing ads beside the text or using them to break up sections.
  • Lack of Functionality- This piggybacks on the design point: text and interactive elements need to be large enough to tap. Everything shrinks in the mobile version, and if tap targets are too small, users will select the wrong option and have to go back and correct it.
  • Loading Speeds- The slower the page, images, videos, and GIFs load, the bigger the hit from this update. Mobile users abandon slow pages even faster than desktop users do.

How to Track Updates for Google Mobile Friendly

As you probably guessed, there's a utility in Website Auditor that can check how well a site will render on a mobile device. To access it, select Content Analysis, then Technical Factors in the Page Audit tab.

This update dings websites that have poor UX and loading speeds on mobile devices. By using this test, you can quickly check the mobile-friendliness of your site.
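For a rough do-it-yourself signal between audits, the sketch below fetches a page and checks for a responsive viewport meta tag and legacy Flash embeds. This is only a heuristic, not Google's mobile-friendly test or Website Auditor's check; it needs the requests and beautifulsoup4 packages, and the URL is a placeholder.

    # Rough mobile-friendliness heuristic: viewport tag, Flash-era embeds, page weight.
    import requests
    from bs4 import BeautifulSoup

    def quick_mobile_check(url: str) -> None:
        response = requests.get(url, timeout=10)
        soup = BeautifulSoup(response.text, "html.parser")

        viewport = soup.find("meta", attrs={"name": "viewport"})
        legacy_embeds = soup.find_all(["embed", "object"])  # rarely work on phones

        print(f"Checked {url}")
        print("Viewport meta tag present:", bool(viewport))
        print("Possible Flash/legacy embeds:", len(legacy_embeds))
        print("Approximate HTML size (KB):", round(len(response.content) / 1024, 1))

    quick_mobile_check("https://example.com")  # placeholder URL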

Google Fred Algorithm Update

Google Fred is the name given to a series of updates designed to lower the ranking of websites focused primarily on generating ad revenue. When these updates rolled out, many pages saw large ranking boosts or drops.

Over time, people pieced together the characteristics shared by the affected websites, and the penalized sites had three things in common.

What Causes a Site to Be Negatively Affected By Google Fred?

  • Focused on Ad Revenue- Google searches are designed to provide people with answers to a variety of questions. If your site carries a large number of ads, you'll see a drop in your ranking, because Fred is focused on surfacing well-written content, not on helping sites generate money.
  • Poorly Written Content- By poor, we mean bland, repetitive content. Fred effectively served as a refresh of Panda, which had been released six years earlier. If your page lacks original content or is riddled with grammar mistakes, Google Fred lowers its ranking.
  • Low-Quality UI and UX- Beyond the text, Fred also looks at the aesthetic appeal of webpages. Google Mobile-Friendly applied this kind of scrutiny to mobile devices, but Fred introduced a similar standard for desktop users as well. This causes poorly designed sites, or those with frequent popups, to be bumped from the top of the search results.

How to Track Updates for Google Fred

Review Google's Search Quality guidelines and make sure your content fits their parameters. Fred can tell worthwhile content from keyword-stuffed ad pages, so aim for naturally keyword-rich writing and fewer ads to help boost your ranking.

Google Top Heavy Algorithm Update

This update works along the same lines as Google Fred, but it focuses exclusively on ads. Specifically, it targets pages that stack a large number of ads near the top of the page; hence the name Top Heavy.

What Causes a Site to Be Negatively Affected By Google Top Heavy?

  • A Large Number of Ads- If Google's algorithm decides you have too many ads at the top of your page, you'll be moved to a lower position in the search results. Top-heavy pages are typically designed to generate ad revenue and pad low-quality, keyword-stuffed content around the ads.

How to Track Updates for Google Top Heavy

Top Heavy isn't updated on a regular schedule, but according to Google AdSense you can have up to 3 ads per page. AdSense also provides a heat map showing the best places to insert ads; these are generally at the top or bottom of a page, or between sections of an article.

Various Tools You Can Use to Increase the Quality of your Website

To make your site look like it was written by a team of professional writers, you can take advantage of a variety of programs to help create SEO friendly content. In this section, we will take a brief look at five different programs you can use to help boost the ranking of your website.

Website Auditor

This is arguably the most useful tool on the list because it covers the functions of the other four and then some. It is also the most expensive of the five, owing to the number of features included.

If you host a large number of articles or are trying to broaden the reach of your site, then this is a worthwhile investment, because it’s one of the best programs available for optimizing web-based content.

Grammarly

Grammarly is a proofreading tool that checks the correctness, clarity, engagement, and delivery of a text. Correctness refers to spelling, grammar, and punctuation, and these issues are highlighted in red. Clarity helps make your writing clear and concise and also highlights when you use passive voice.

Engagement looks at the text and makes it more interesting by giving you synonyms to terms, or helping your text match the overall tone of the article. Delivery works along similar lines as Engagement, but it’s more focused on the reader’s attitude. 

Most of the time, issues with Delivery revolve around technical issues that are commonplace in writing, such as ending sentences with a preposition. By using these four broad categories, an article is assigned a value between 1 and 100. 

After using this tool, make sure you have zero correctness flags, because these are the issues that updates such as Google Panda and, to a lesser degree, Google Fred look at. Grammarly also lets you proofread articles in one of four English localizations, which is useful if you are publishing content for both U.S. and British audiences.


Hemingway Editor

Hemingway is a great tool for assessing the readability of your text. It assigns a reading grade level and highlights passages that are hard or very hard to read, as well as uses of passive voice. It also flags adverbs and terms that have simpler alternatives, giving you the option to change or omit the word or phrase.

When using this tool, aim for a score between Grade 5 and Grade 9. A lower grade is fine, but anything at Grade 10 or higher should be simplified; an easy way to do this is to split longer sentences into shorter ones.
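If you prefer to script this check, the textstat package implements standard readability formulas. Hemingway's exact scoring isn't public, so treat the Flesch-Kincaid grade below as a rough stand-in; the sample text is invented.

    # Approximate reading-grade check using textstat (pip install textstat).
    import textstat

    draft = (
        "Google rewards pages that are easy to read. "
        "Long, winding sentences with many clauses push the grade level up, "
        "so split them into shorter ones where you can."
    )

    grade = textstat.flesch_kincaid_grade(draft)
    print(f"Approximate reading grade: {grade}")
    if grade > 9:
        print("Consider splitting long sentences to bring this under Grade 9.")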


Copyscape

Copyscape checks for plagiarism, which can hurt your ranking under updates such as Google Panda, Google Fred, and Google Pirate. The site lets you paste a URL or a block of text into a search box and check for duplicate-content flags. 

This will also highlight quotes that aren’t in quotation marks, so you can fix that before Google’s search algorithm hurts your rank.


SEOBook’s Keyword Density Analyzer

This is a useful site for checking how often words and phrases occur in an article. If any term's rate climbs above 2.5%, consider lowering it by swapping in synonyms or the related terms suggested by the TF-IDF feature in Website Auditor.
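A simplified version of this check is easy to script yourself. The sketch below counts word frequencies in a draft and flags anything above the 2.5% threshold mentioned above; it is not SEOBook's implementation, and the article.txt filename is a placeholder for your own draft.

    # Minimal keyword-density check; flags words above a 2.5% occurrence rate.
    import re
    from collections import Counter

    def keyword_density(text: str, threshold: float = 0.025) -> None:
        words = re.findall(r"[a-z']+", text.lower())
        counts = Counter(words)
        total = len(words)
        for word, count in counts.most_common(10):
            rate = count / total
            note = "  <-- consider a synonym" if rate > threshold else ""
            print(f"{word:15s} {rate:6.2%}{note}")

    with open("article.txt") as handle:  # placeholder: your draft file
        keyword_density(handle.read())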


Final Thoughts

Whether you own a large website that covers all things tech or you're a part-time blogger writing about whatever subjects interest you, you can make use of search engine optimization.

By creating original content focused on informing readers, rather than pages stuffed with keywords, you will climb the Google rankings. That, in turn, lets you share your thoughts, feelings, and opinions with a wider audience.

The post 7 Algorithm Updates Currently Affecting Your SEO first appeared on SEO Optimizers.

How To Speed Up Your Site For SEO https://seooptimizers.com/blog/how-to-speed-up-your-site-for-seo/ https://seooptimizers.com/blog/how-to-speed-up-your-site-for-seo/#respond Wed, 15 Mar 2017 17:07:57 +0000 https://seooptimizers.com/?p=35970 Search engines like Google use many thousands of different signals to determine which websites rank highest for each keyword. There are many ways to get your website to rank higher. For example, it’s common for search engine optimization specialists to focus on optimization tactics like building high quality links, or making sure the right keyword…

Speed Up Your Site For SEO

Search engines like Google use many thousands of different signals to determine which websites rank highest for each keyword. There are many ways to get your website to rank higher. For example, it's common for search engine optimization specialists to focus on optimization tactics like building high quality links, or making sure the right keyword is on the right page of your website. One tactic which can also be used to help improve your search engine rankings is the speed at which your website loads. Google cares more and more about site speed these days, because mobile devices tend to have slower internet connections, which leads to slower-loading sites having higher bounce rates on mobile devices.

“If your website loads faster, you make money faster,” says Max Goldberg, digital marketing specialist. Here are some tips to speed up your website, which help improve your bounce rate, and your ranking on Google:

Use Smaller Images

Text loads very quickly on most websites, and usually even pages with thousands of words of text don’t have trouble loading on mobile devices. However, images load much more slowly than text. It’s common for webmasters to not realize that the images on their websites are sometimes a megabyte or more per image, which can cause your load times to be too long for average users. Ideally, you want to make sure that all of the image files on your website are under 100kb in size, which even mobile 3g connections can load quickly.
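To find the heavy images before your visitors do, a small script can walk your site's files and flag anything over the 100 KB budget suggested above. This is a local-files sketch; the public_html folder name is a placeholder for wherever your site's assets live.

    # Flag image files larger than roughly 100 KB.
    import os

    SIZE_LIMIT = 100 * 1024          # 100 KB
    site_root = "public_html"        # placeholder: path to your site's files

    for folder, _dirs, files in os.walk(site_root):
        for name in files:
            if name.lower().endswith((".jpg", ".jpeg", ".png", ".gif", ".webp")):
                path = os.path.join(folder, name)
                size = os.path.getsize(path)
                if size > SIZE_LIMIT:
                    print(f"{path}: {size / 1024:.0f} KB -- compress or resize")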

Use Less Rich Content

Similar to images, any rich media will slow your site down. It's a balance between improving the user experience with great content and hurting it with slow load times. If you're serving video or audio from your own server, compress the files to be as small as possible.

The custom display stands page of Decorative Events is a good example of an image-rich site that loads fast.

Use Faster Servers

No matter what content is on your website, the speed at which your server delivers it has a major impact on load time. The average website on a shared hosting plan from GoDaddy or Namecheap is inexpensive for a reason: it's not usually very fast. If you can afford a better hosting plan, ideally a virtual private server or dedicated machine, you will see faster load times and lower bounce rates.

Use Google’s Page Speed Insights Tool

Google provides a tool for webmasters to check their load speed compared to other websites, called the PageSpeed Insights Tool. You can submit your URL and receive a checklist of items that are negatively impacting your website's load speed. Take note of the major items and start improving them one by one. You can also enter your competitors' websites and see if they load faster than yours. If you notice that you have slow-loading images, they don't, and they outrank you, you know what you need to do! Remember that site speed testers run from offices in Shoreditch and all across the world, so get a few speed tests done for more accurate results.
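The web tool also has a companion API (version 5) that you can call from a script to track scores over time. The sketch below uses the requests package; the endpoint and response field names reflect the v5 API as documented at the time of writing and may change, and the URL is a placeholder.

    # Query the PageSpeed Insights v5 API for a mobile performance score.
    import requests

    API = "https://www.googleapis.com/pagespeedonline/v5/runPagespeed"
    params = {
        "url": "https://example.com",  # placeholder: the page you want to test
        "strategy": "mobile",          # or "desktop"
    }

    data = requests.get(API, params=params, timeout=60).json()
    score = data["lighthouseResult"]["categories"]["performance"]["score"]
    print(f"Mobile performance score: {score * 100:.0f}/100")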

Convert New Posts to AMP

AMP is an open source standard for mobile devices backed by Google, which stands for "Accelerated Mobile Pages". The AMP standard changes how mobile users view your website's content: instead of all the content loading, Google indexes a stripped-down version of your site, reduced to mostly plain HTML, ensuring mobile devices can load it more quickly. AMP pages are preferred by Google in mobile searches. If you have a WordPress site, you can deploy AMP pages easily using the AMP plugin.

The post How To Speed Up Your Site For SEO first appeared on SEO Optimizers.

Are TLDs Really Relevant for SEO? https://seooptimizers.com/blog/are-tlds-really-relevant-for-seo/ https://seooptimizers.com/blog/are-tlds-really-relevant-for-seo/#respond Thu, 03 Mar 2016 15:29:33 +0000 https://seooptimizers.com/?p=1465 Are you still wondering what domain extension to choose for your new blog or your new project? Registrars tend to stress the benefits of a certain TLD over another in terms of SEO, but should you listen to that advice? Actually, NO. There are plenty of myths around the role of TLDs and domain names…

Domain TLDs: Relevant for SEO?

Are you still wondering what domain extension to choose for your new blog or your new project?

Registrars tend to stress the benefits of a certain TLD over another in terms of SEO, but should you listen to that advice?

Actually, NO. There are plenty of myths around the role of TLDs and domain names in SEO, and I wrote this post to dispel the best known and most talked about of them, to help you make informed decisions and avoid damage to your branding and SEO.

A TLD Doesn't Make Your Rankings

It has been debated over and over in the years, but Google has always denied any benefits of a certain TLD over another in terms of SEO.

The last time was in July 2015, when John Mueller posted on Google+ to clear up any doubts and misconceptions: "if you spot a domain name on a new TLD that you really like, you're keen on using it for longer, and understand there's no magical SEO bonus, then go for it."

Also, on the Google blog, Mueller adds:

Q: Will a .BRAND TLD be given any more or less weight than a .com?
A: No. Those TLDs will be treated the same as other gTLDs. They will require the same geotargeting settings and configuration, and they won't have more weight or influence in the way we crawl, index, or rank URLs.

Your chosen TLD won't make or break your future Google rankings. What it really takes to rank is smart work on your content, marketing, and networking efforts.

As Neil Patel says in his blog, “The major factor in whether TLDs have an effect on SEO is whether the domain contains keywords. Of course, you and I know that Exact Match Domain names risk getting penalized.”

What really makes the difference, indeed, is keywords in your domain name, regardless of extension. In fact, Google has been penalizing EMDs (exact match domains) since 2012, so you should by all means avoid exact keyword domain names (like cheap-payday-loans.com) and instead focus on branding (think google.com, walmart.com, etc.).

When Does A TLD Count?

A problem may arise when certain TLDs, as well as certain IPs, are blacklisted or look suspicious because of spamming (mostly email spam), but in general the domain extension of your site does not play a major role in your SEO efforts. gTLDs and the new TLDs are no exception. As long as you play it white hat and don't spam, you're good to go.

Ensure that your site is on an IP that is not associated with spam and thus is not blacklisted. You can use WhatIsMyIPAddress.com to find out if your site IP is blacklisted.
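Under the hood, most blacklist checkers query DNS-based blocklists (DNSBLs). The sketch below shows how such a lookup works using the dnspython package, with the Spamhaus ZEN list as an example; check the list operator's usage policy before querying in bulk, and note that the IP shown is a documentation placeholder.

    # DNSBL lookup sketch (requires dnspython 2.x).
    import dns.resolver

    def is_blacklisted(ip: str, dnsbl: str = "zen.spamhaus.org") -> bool:
        reversed_ip = ".".join(reversed(ip.split(".")))
        try:
            dns.resolver.resolve(f"{reversed_ip}.{dnsbl}", "A")
            return True                      # an answer means the IP is listed
        except (dns.resolver.NXDOMAIN, dns.resolver.NoAnswer):
            return False                     # no record means it is not on this list

    print(is_blacklisted("203.0.113.10"))    # placeholder IP from the documentation range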

Beware of free TLDs, however. Because these have been used massively for spam and malware distribution (CO.CC and CZ.CC, for example), Google has blacklisted some of them outright, so your website might never get a chance to rank. Always research news about your chosen free TLD to see whether it's a viable choice while you wait to buy a standard TLD.

The post Are TLDs Really Relevant for SEO? first appeared on SEO Optimizers.

Honesty in SEO Pays Off – A Ghostblogging Case Study https://seooptimizers.com/blog/honesty-in-seo-pays-off-ghostblogging-case-study/ https://seooptimizers.com/blog/honesty-in-seo-pays-off-ghostblogging-case-study/#respond Tue, 16 Feb 2016 14:25:24 +0000 https://seooptimizers.com/?p=1461 Does Ghostblogging Work? Last Update Remember how SEO used to go before Google Webmaster Guidelines? Optimizing a website for search engines was a mere collection of tips and tricks that did the job just fine. You could stuff your page with keyword; you could build a gazillion of low quality backlinks and get rewarded for…


Does Ghostblogging Work?

Last Update Jun 4, 2024 @ 1:03 pm

Remember how SEO used to go before the Google Webmaster Guidelines? Optimizing a website for search engines was a mere collection of tips and tricks that did the job just fine. You could stuff your page with keywords; you could build a gazillion low-quality backlinks and get rewarded for it.

There are webmasters and SEOs who still attempt to ‘optimize’ their websites (or their clients’) using these old tricks, but they end up with a penalty or they don’t get indexed at all.

That’s because they miss out on today’s most important SEO asset: honesty.

I’ve always believed in honesty. I apply it in all things in my life. But I didn’t realize how important this virtue was in SEO until I started to use SEO for work — and no matter how much I may personally dislike Google’s guidelines and defy them whenever I can, I realized I can’t do the same with my clients’ websites.

Being ‘honest’ in Google’s terms means certain things over others and I had to be honest in Google’s terms to make things really work for my clients.

This article is a short case study that draws directly on my 17 months of experience as a ghostblogger with SEO and social media duties for a team of blog owners who will remain anonymous in this post, and whom I will call ITIPComp (the name is completely fictional).

An initial scenario that required honesty

When the ITIPComp team e-mailed me to inquire about my writing services, they were not just concerned about their rankings; they also had an urge to build up their blog's reputation and attract traffic and a human following.

In other words, they needed to:

  • Come up with a specific identity of their blog and work it into a sub-niche to make their blog stand out in the crowd
  • Spread blog and niche related keywords and key-phrases in deep links and around the Web in natural contexts
  • Get their blog known on Social Media
  • Get traffic and rankings from Google.

Setting short- and long-term goals for a 100% honest ghostblogging + SEO strategy

The first time the ITIPComp team contacted me, all they needed was a writer. My job was to come up with 3 short articles (500-600 words) a week and to use keywords in a way that would help the blog rank and gain traffic without falling in Google’s bad book.

There was a big problem with that newborn blog, though: it was like every other blog in the same niche. It had no special take, no identity. It was never going to rank like that, and if it did, it wouldn’t be for long and users would have had no interest in choosing this blog over another, perhaps more popular.

So I offered to tailor our ghostblogging + SEO plan around a specific angle, a take that would set the blog apart from any other and would help the team stand out for their humor, vision, and honesty.

The ITIPComp team worked together to come up with a logo, a vision, and an About Us page that would convey all these traits, then chose the keywords they wanted to rank for and the topics they wanted to offer the public. I reviewed the keywords their SEO team had previously tried to optimize for and analyzed the number and quality of backlinks and citations the company had around the Web.

How to face the challenge of a little known blog without entering the questionable area of black hat SEO?

With my client, I chose to take small steps and to build links and reputation through quality content writing.

I’m a freelance blogger, and writing was the first service she hired me for before I became her SEO. So, why not?

The plan involved intense content production and the following SEO and Marketing efforts:

  • Interviews through the MyBlogU platform – involved other bloggers with the blog through interviews and citations.
  • Blog Comments – under my ghost (agreed-upon) name or the team name. Comments were consistent with the blog post content and provided value for the owner and their readership to engage with.
  • SEO – all posts were optimized for search engines, and we used keywords in titles and hashtags on both posts and social updates.
  • Social Media Updates and Community Involvement – used social media and web communities to post updates on new content available at the blog and to answer users’ questions and help them solve their problems.
  • Content marketing through Social Media and the Kingged web community – user involvement and feedback from all the platforms helped us improve our plan for the growth of the blog.
  • Wise deep linking – we made good use of internal linking to boost rankings and help users find related content.
  • Weekly or Bi-Weekly Communications / Activity Reports – with percentages of growth or decrease in SERPs and in the traffic rank chart. Social activities were also reported.

On the client's side, the ITIPComp team worked on some local marketing.

The results: honest work paid off

The ITIPComp blog slowly but steadily increased in worldwide traffic, Google rankings, and reputation. No manual actions were ever present in Webmaster Tools. This blog grew while staying clean in Google's book.

We never hit more than 100-150 unique visits daily, because doing it

  • from scratch
  • without an existing community
  • entirely trick-less

is hard… but not impossible. The blog earned a small but loyal community and some quality backlinks while the ITIPComp team and I worked together. Here is a screenshot from Google Analytics:

The peak between Jan 10 and Jan 19 was caused by an interview-based blog post we published on Jan 10. The added social shares from the interviewees caused a boost in traffic, confirming the effectiveness of collaborative posts in helping traffic and backlinks grow.

The ITIPComp team still needs to work on their blog community and on adding guest posts to the marketing mix, but the work done over these 17 months has laid a solid base for future work.

Why does honesty in SEO pay off?

Let’s list a few takeaways from the case study:

  • Google’s guidelines encourage webmasters to use an honest SEO strategy to earn backlinks and rankings: if you want to stay in Google’s good book and improve your rankings without risking a penalty, you should stick to these guidelines.
  • Being honest in Google’s terms doesn’t mean you have to just wait and hope your pages get found, liked and linked to: it takes good on-page SEO and marketing effort to keep your blog alive and thriving.
  • It is possible to build traffic and reputation from scratch: it takes more effort, hard work and the patience to connect with other bloggers in the same niche, to be helpful and friendly, to build a community around your blog and to write content that makes the difference in the reader’s life.
  • A genuine, truly helpful social media and commenting activity complements SEO and both complement other forms of marketing (online and offline).
  • Collaboration with other bloggers, interviews and responsivity to feedback can mean a great deal to the growth of your blog.

How did you build solid traffic, rankings and a community around your blog? Share your story.

The post Honesty in SEO Pays Off – A Ghostblogging Case Study first appeared on SEO Optimizers.

Google Says Less Reconsideration Chances For Repeated Violations https://seooptimizers.com/blog/google-says-less-reconsideration-chances-for-repeated-violations/ https://seooptimizers.com/blog/google-says-less-reconsideration-chances-for-repeated-violations/#respond Thu, 04 Feb 2016 14:00:18 +0000 https://seooptimizers.com/?p=1363 On September 18h, 2015, a Google Webmaster Blog post came up to shake webmasters and SEOs and give everybody with a website some food for thought — Google is no longer granting an easy and painless reconsideration to those websites that violate Webmaster Guidelines again and again after they have been successfully reconsidered. To quote…

Google Penalty

Photo Credit: http://www.candidwriter.com (cc)

On September 18th, 2015, a Google Webmaster Blog post came out that shook webmasters and SEOs and gave everybody with a website some food for thought: Google is no longer granting an easy and painless reconsideration to websites that violate the Webmaster Guidelines again and again after they have been successfully reconsidered.

To quote the blog post: “Such repeated violations may make a successful reconsideration process more difficult to achieve. Especially when the repeated violation is done with a clear intention to spam, further action may be taken on the site.”

To be honest, I always suspected that, but the fact that Google’s team is making this position public might mean the case has not been rare after all.

What Marketing Experts Think About It

I asked the three Marketing experts I interviewed for the post on HTTPS and blogs what their reaction to this news was.

Here are the responses:

Cormac Reynolds, company director at MyOnlineMarketer.co.uk:


I think it's fair enough. You can't just keep spamming and respamming. Yes, we can all push the boundaries and get hit with a manual penalty, but to do the same thing repeatedly doesn't stack up well. Additionally, it takes up manual reviewing time that could be spent on reviews of other legit businesses trying to make the effort to go down a straight line.

David Leonhardt, president at THGM Writers:


Wow!  this is shocking news.  This implies that for the past decade or more, Google has been repeatedly penalizing sites and letting them offend again with impunity (well, with no more penalty than for their first offence, which is a lot like impunity wearing a mask of contrition).  I had always assumed that Google has a list of “bad boys” that they keep in a special dungeon and periodically send out a special squad of paintball hitmen.

Lukasz Zelezny, head of organic acquisition at Zelezny.uk:


Google has never professed to allow black hat SEO, so I’m quite surprised that so many people were shocked by this announcement.  Yes, every webmaster will make mistakes from time to time – just look at the whole guest posting fiasco of 2013 – however when a site repeatedly gets penalised over and over again, I think it’s clear that the site owner either really has no clue what he’s doing, or that he’s spamming the search results.  In the former case, you’d assume that after one Google penalty, you’d start getting clued up, so I believe those that see numerous penalties are aware of what they are doing and haven’t been too worried about it as the short term traffic has made it worth their while.

Webmasters who try to adhere to Google’s guidelines should have nothing to worry about.  But if you do have cause for concern, it’s worth subscribing to the Google blog and staying up to date with what the search giant says.  You may also want to hire someone in the SEO industry to perform an audit of your site to ensure you aren’t breaking any rules.  If you’re not breaking the Google guidelines on purpose, though, you shouldn’t have anything to worry about.  The new penalty should be good news for those of us who stick to the rules.

A Final Thought (And A Takeaway)

It may be easy to fall into the trap of gaming Google again and again, but while a website might escape an algorithmic penalty, it's unlikely to escape a manual one, and manual actions are anything but rare.

Unless you no longer care about Google's opinion of your website and prefer other traffic sources and marketing methods, getting caught violating the Guidelines a second or third time can backfire over the long term.

If you decide that Google's Guidelines are not for you, making the effort to change your traffic and lead generation plan is a more effective and less stressful option than continuing to play hide-and-seek with Google.

What’s your reaction to this news? How do you deal with Google penalties (if you do)?

The post Google Says Less Reconsideration Chances For Repeated Violations first appeared on SEO Optimizers.

4 Ways to to Stay on Top of SEO https://seooptimizers.com/blog/4-ways-to-to-stay-on-top-of-seo-in-2016/ https://seooptimizers.com/blog/4-ways-to-to-stay-on-top-of-seo-in-2016/#respond Mon, 25 Jan 2016 19:06:34 +0000 https://seooptimizers.com/?p=1453 Search Engine Optimization Matters Search Engine Optimization (SEO) can make or break a new business, and has a huge impact on whether a new site is commercial success. The general concept, for the uninitiated, is that search engines use criteria from your site, plug them into an algorithm, and then use the result to decide…


Search Engine Optimization Matters

Search Engine Optimization (SEO) can make or break a new business and has a huge impact on whether a new site is a commercial success. The general concept, for the uninitiated, is that search engines take criteria from your site, plug them into an algorithm, and then use the result to decide if your site contains the information people are looking for.

In principle, it’s very straightforward. In practical terms, it’s a real challenge because search engines tend to be very secretive about what those criteria actually are. Because of this, there is a whole industry of modern day code breakers attempting to reverse engineer Google’s algorithms day in and day out. Placing higher in the search results is big money, so, sooner or later, the inner workings of the black box are puzzled out.

Booms and Busts

People being people, about a tenth of a second after someone figures out what the rules are, everyone starts gaming them, and optimizing their pages to appease the search engines and inflate their ranking for various search terms irrespective of actual content value.

This is, as we’ve said, a huge industry.

Of course, once everyone starts gaming the system, spam begins to sprout up everywhere, and the overall quality of the search results begins to drop.

SEO Rules Change

The big search engine companies don't take this lying down, however; Google and Bing are constantly updating and changing their algorithms to weed out spam, prevent us from gaming the system, and improve the predictive ability of their search engines.

That's why, a few years back, SEO was mostly a matter of keyword stuffing (but not stuffing too much), while 2015 was the year of long-form content and long-tail keywords.

What 2016 will be remains to be seen, but Google has already been busy shaking up the rankings in January, so it’s likely there’s going to be more to see in the months ahead.

What’s New in 2016

The biggest SEO change for developers this year has been Google’s update of their core algorithm. Everything they’ve done and changed isn’t apparent just yet, but there’s not too much mystery, because changes aren’t incorporated into Google’s core algorithm until it’s a fair bet they don’t need to be watched constantly to avoid disaster.

In other words, Google has probably just taken its big changes from 2015 (and before) and set them free to run without supervision. Most notably, Panda has been moved into the core, and while no one is completely sure what that means, it's probably important. It seems likely that various mobile search ranking preferences have also been subsumed, but it's too early to tell.

All of which sort of leads to the question, “If everyone is unsure what’s going on, how am I supposed to keep the sites I manage optimized?”

Well, there are a few things you can do to stay ahead of the curve.

How to Stay on Top of SEO Trends

1. Install plugins that do most of the work for you.

If you’re using WordPress, and you probably should be, unless you’ve got a custom coded site, there are a number of plugins that do much of the heavy SEO lifting for you by automating many tasks and telling you if you’re making mistakes. They update regularly to reflect consensus changes in SEO practices.

The big one is Yoast.

2. Follow a few search engine blogs.

When all else fails, turn to the experts. The thing is, you need to be sure you’re finding sources who stay on top of developments. An expert who knew absolutely everything about SEO in 2014, but hasn’t kept up on things, is the next best thing to useless in 2016.

You don’t have to turn yourself into an expert, but very rarely is knowing more than you need to a disadvantage in life. On top of that, the evolving means and methods of how we find, collate, and deliver information is one of the most salient issues of modern civilization. If the internet is a lever moving the world, then search engines are the fulcrum on which it rests. So find a blog and follow it.

3. Learn the rules of the system.

Google makes a real effort to tell you what not to do so you don't get penalized. Make sure you check in on those rules regularly, or, if this is your first site, learn them. Since Google is setting the rules, even if the exact process isn't clear, following them should keep your site from being penalized.

The process is more than worth the time invested, and it will probably help you create better content, which is the next point on this list.

4. Create Great Content

While we don’t ever know exactly what’s going on inside the search engine algorithms, we know exactly what the goal of those search engines happens to be: Delivering users to the best content on the subject they’re looking for.

In this, at least, nothing has changed since the beginning. The take-home, then, is that you should create content with value to the people who find it. If your content is the best answer to a search query then whatever changes Google makes will be with the goal of bringing people to your content (you know, in a large and impersonal sense).

If you’re working to create great content that users will find helpful, well, you’re probably going to make content that’s pretty close to optimized, whatever minor tweaks and changes come your way in the months ahead.

The post 4 Ways to to Stay on Top of SEO first appeared on SEO Optimizers.

Does Guest Blogging Help With SEO Rankings? https://seooptimizers.com/blog/does-guest-blogging-help-with-seo-rankings/ https://seooptimizers.com/blog/does-guest-blogging-help-with-seo-rankings/#respond Tue, 03 Nov 2015 17:00:17 +0000 https://seooptimizers.com/?p=1340 I know, I know— controversial topic after Matt Cutt’s blog post shunning guest blogging for SEO in 2014. Even more controversial after the mass penalty on MyBlogGuest.com and the “all nofollow” reaction from eConsultancy that followed. Guest blogging really seems like a gray area to find your marketing and link building efforts in right now.…

I know, I know: a controversial topic after Matt Cutts's blog post shunning guest blogging for SEO in 2014.

Even more controversial after the mass penalty on MyBlogGuest.com and the “all nofollow” reaction from eConsultancy that followed. Guest blogging really seems like a gray area to find your marketing and link building efforts in right now.

But it’s a question everybody keeps asking: “Does guest blogging still help with SEO?”

The answer is neither yes nor no, but actually slightly more complicated.

I got two writers to share their ideas for this post. Their insight makes it clear that things have changed, but not so much to the smart (and honest) business.

Read on and take away.

Guest Blogging Doesn’t Help With SEO Rankings As It Used To…

Deborah Anderson from Social Web Cafe recalls when it was easier to rank with guest blogging and why it doesn’t work as it used to:

Previously, there was some SEO benefit with guest blogging because of the “dofollow” links.  Of course, there is no such thing as a dofollow link.  What it really is is the absence of the “nofollow” rel tag connected to the link.  This told Google to follow the link (dofollow) and that provided SEO benefit.

What happened is that some people were putting up very inferior “crap” articles and the search results (SERPs) were reflecting poor quality content.  Therefore, Google singlehandedly put a stop to that benefit, to stop people from putting up poor quality articles only for the purpose of the SEO benefit.  That move hurt the rest of us who didn’t mind the SEO benefit but were guest blogging for the other benefits.

What I just described is the direct benefit.  The indirect benefits of guest blogging still exist.  Those include branding, exposure, building the writing portfolio, networking, etc.  Also, there are some sites that do not use the nofollow tag (in other words, dofollow), but those are becoming fewer and fewer.  What happened back in March of 2014, to MyBlogGuest’s clients (basically all of them) resulted in everyone changing their links, site-wide, to nofollow.  Those who did not remained under penalty from Google.  That caused other sites to also change their links to nofollow so that they would not be penalized by Google.

It was a brilliant move by Google, to inflict fear across the globe, for publishers and writers on the Internet.  In one fell swoop they were able to make a statement about the type of content they want to show in their search engine.  We, as consumers of Google products, are to blame, as well, because we gave them the power to be the top dog for search.   And, what can I say.  I love their calendar and I am the “hangout queen” using the Google hangout product.

There are still indirect SEO benefits to guest blogging (i.e. a viral article that happens to do well in the search rankings), but it is not the SEO benefit that it used to be a couple of years ago.  If you are a writer, your best bet is to write for other reasons and then count the SEO benefits as icing on the cake.  If you are a publisher, your best bet is to nofollow the links that are contained in the guest articles that you receive.

It follows that guest blogging purely as a way to build links is no longer a viable SEO strategy. Even so, guest blogging has always helped, and still helps, with your branding efforts, and it can pull in links from around the Web in indirect ways, as Deborah suggests, which is what the next section is all about.

BUT You Can Still Guest Blog To Indirectly Help Your SEO Rankings

Not all is lost with the “guest blogging penalty”. Here is what David Leonhardt from THGM Ghostwriter Service has to say about it:

Anything that spreads your name is good for SEO.  Every time you put your name out there, there is a greater chance that people will look for your website, find something they really like and link to it or share it on social media, and that’s good for SEO.

If there is a hyperlink, even a NoFollow one, that serves the same purpose.  If the link is DoFollow, even better.

The value of guest blogging for SEO depends greatly on the quality of the content on your website.  If lots of people read your amazing guest post, then rush to see what’s on your website, only to ding boring stuff, forget earning links and social shares from them.

When you get nofollow links to your blog posts and articles, users will still find them and link back to them as helpful resources in their own posts (editorially) if they deem your content trustworthy and authoritative. That is what I do here at Bosmol as a writer: anytime I stumble upon an amazing resource that fits into a post I'm writing, I go ahead and link to it, whether the link that led me there was nofollowed or not.

Continue to guest blog and let users find you, if not search engines. In the end, as the saying goes, "all chickens come home to roost": you will still earn backlinks that benefit your SEO efforts.

Do YOU guest blog? Has your opinion of guest blogging changed over time or are you still a fan?

Share in the comments below. 🙂

The post Does Guest Blogging Help With SEO Rankings? first appeared on SEO Optimizers.

Best Internet Marketing Blogs https://seooptimizers.com/blog/best-internet-marketing-blogs/ https://seooptimizers.com/blog/best-internet-marketing-blogs/#respond Tue, 06 Jan 2015 18:00:51 +0000 https://seooptimizers.com/?p=1115 Why Local SEO is Important Last Update When you first walk through the Internet Marketing door, you’re bombarded with so much information it can become a little overwhelming. Sometimes it can takes weeks or even months to figure out who is giving great practical advice, to the wannabes who are just in it to take…

Why Local SEO is Important

Last Update Jun 4, 2024 @ 1:06 pm

When you first walk through the Internet Marketing door, you're bombarded with so much information it can become a little overwhelming. Sometimes it can take weeks or even months to figure out who is giving great practical advice and who are the wannabes just in it to take your money. Whether you're just starting out or would like to improve your Internet Marketing knowledge, we have listed 11 well-respected Internet Marketing blogs you must follow.

1. Moz
If you don’t already know about Moz, thank your lucky stars you came across our site. Moz is considered the biggest authority site relating to all things SEO. Over the last few years, Moz has created an unlimited supply of fresh information for digital marketers to improve on their SEO strategies. A must follow site for any Internet Marketer.

Website: http://moz.com/blog
Twitter: https://twitter.com/moz
Best for: Learning SEO

2. Social Media Examiner
Whether it’s Facebook, Twitter or any other social media site, everything you could possible want to know is on Social Media Examiner. This blog focuses solely on social media marketing, helping you understand the best way to use paid ads, providing case studies and tips and tricks you won’t find anywhere else on the Internet.

Website: http://www.socialmediaexaminer.com
Twitter: https://twitter.com/smexaminer
Best for: Understanding social media marketing

3. John Chow
John Chow is the poster boy for the blogging scene after he took his blog from $0 to making well over $40,000 a month in less than two years. The impressive part? He spent less than 2 hours a day working on it. Looking to start a blog and make money? Look no further than John Chow.

Website: http://www.johnchow.com
Twitter https://twitter.com/johnchow
Best for: Blogging

4. Niche Pursuits
Want to build your own business online but don’t know where to start? Follow Niche Pursuits and you will have more ideas than you know what to do with. From buying and selling websites, starting your own niche business and understanding SEO, Niche Pursuits is the site to follow for all future online business owners.

Website: http://www.nichepursuits.com/
Twitter https://twitter.com/nichepursuits
Best for: Creating niche websites

5. Mailchimp
Every Internet Marketer will tell you the money is in the list! They are right so you better start following a blog that tells you how to create kickass newsletters. Mailchimp is the go to website for all things email marketing, they constantly put out new blogs showing users how to make the most of email marketing.

Website: http://blog.mailchimp.com/
Twitter https://twitter.com/mailchimp
Best for: Email marketing

6. Digital Marketer
Head over to Digital Marketer for the most up-to-date tips on how to drive more sales, increase your conversions and boost social engagement overnight. What makes Digital Marketer so unique is their blogs are written by actual Internet Marketers, who run their own online businesses rather than ‘researchers’.

Website: http://www.digitalmarketer.com/blog/
Twitter https://twitter.com/digitalmktr
Best for: Getting information on how to increase sales and conversions for your websites

7. MarketingProfs
MarketingProfs probably have the biggest online library of amazing reports and case studies relating to writing content and digital marketing. They are constantly releasing new blog posts, market studies and guides to help you plan your perfect content strategy. Want to know what works? Head on over to MarketingProfs.

Website: http://www.marketingprofs.com/marketing/library
Twitter: https://twitter.com/marketingprofs
Best for: Content marketing information

8. Occam’s Razor
Understanding metrics, results and leveraging them to improve your online strategy is what Occam’s Razor is all about. Hundreds of blog posts, videos and podcasts are uploaded regularly to help you better understand the results of your Facebook ad, website visitors or why your website is not performing the way you’d like.

Website: http://www.kaushik.net/avinash/
Twitter https://twitter.com/avinash
Best for: Understanding online analytics

9. Matt Cutts
Not heard the name before? Well, you'd better get used to it. Matt Cutts is currently head of Google's Webspam team, the department that polices how content ranks in Google. Follow Matt Cutts to get an inside peek into Google's soul. He may not be an Internet Marketer, but his advice and knowledge are invaluable.

Website: https://www.mattcutts.com/blog/
Twitter: https://twitter.com/mattcutts
Best for: Understanding Google

10. Bosmol
Get the latest news related to social media marketing as it changes on a daily basis. New sites arise making it difficult to hone in and focus your marketing efforts on one channel. Bosmol.com breaks it down and lets you know how to find the right audience for your products and services.

Website: http://bosmol.com
Twitter: https://twitter.com/bosmol
Best for: Social media marketing

11. Socialwatch.co
Last but not least on our impressive list is SocialWatch. They focus solely on how to optimize every aspect of your social media feed, from the way it looks, how fast it loads to increasing conversions. If you only follow one website optimization blog, make sure it’s Socialwatch.co.

Website: http://rich-page.com/
Twitter: https://twitter.com/richpage
Best for: Website optimization

And there you have it, ladies and gentlemen: 11 world-class Internet Marketing blogs in a plethora of online niches. Whatever your goals are, the blogs above will provide you with a suitable framework to get you on the right path to becoming a successful Internet Marketer.

The post Best Internet Marketing Blogs first appeared on SEO Optimizers.

W3C Validation Not Part Of Google Search Engine Ranking Factor https://seooptimizers.com/blog/w3c-validation-not-part-of-google-search-engine-ranking-factor/ https://seooptimizers.com/blog/w3c-validation-not-part-of-google-search-engine-ranking-factor/#respond Mon, 22 Feb 2010 19:16:33 +0000 https://seooptimizers.com/?p=981 There have been many arguments as to whether or not Google search engine ranking factors look for a clean code in a website. The W3C Schools have brought forth a very popular and widely used tool, called the W3C Validation and the CSS Validation. Basically, both these tools look at your HTML and CSS Style…

There have been many arguments as to whether Google's ranking factors reward clean code on a website. The W3C provides two very popular and widely used tools, the Markup (HTML) Validator and the CSS Validator. Both tools scan the HTML and CSS style sheets that make up your website and report any errors, checking that your code complies with current web standards and is free of mistakes.

Recently, Matt Cutts, a Google software engineer, said again that W3C validation does not affect search engine rankings. This is not the first time he has said it, and it won't be the last, because many people remain convinced that validation is a ranking factor and that a cleanly coded website will rank higher. That is, in fact, untrue. Clean code does help the search engine spiders read and understand your website more easily, guiding them to the important places on your site without getting tangled in unnecessary markup, but it is not a ranking signal in itself.

One reason W3C validation isn't used as a signal is that Google is more concerned with browser compatibility. A website that looks perfect in Internet Explorer won't necessarily look the same in Mozilla Firefox, on mobile phones, on web TV, and so on. A site can pass the W3C validation test and still render badly in some browsers, which is why Google doesn't factor validation into its rankings.

Another reason Google doesn't validate websites is the time it would take to validate each individual page. Internet users want results instantly; they won't wait around when a million other sites offer similar information. Validating every page would slow down both crawling and the search experience, so Google leaves it out.

Although truly universal web standards don't appear to be coming anytime soon, we can keep our hopes up. Universal standards would make web designers' and developers' lives much easier. For now, different browsers display websites differently: on Mozilla Firefox your website may look perfect, while on Internet Explorer it may be off-center. This causes plenty of headaches and tiresome trial and error.

Efforts have been made to create universal web standards, but they have yet to be widely adopted. Designers and developers are at odds with one another because of the problems browser incompatibility causes. Until universal standards take hold, people will keep working behind the scenes to build websites that look good in every browser, a time-intensive, tedious project that universal standards could eliminate.

The major roadblock to universal web standards is the browsers themselves. They are not fully compatible with one another, so websites look good in one browser and terrible in another. Designers and programmers are frustrated by these inconsistencies, which make it hard to know which standards to follow when building or coding a new site.

Matt Cutts even admits he wishes Google did validate webpages, but the reality is that many sites on the web, even popular ones, do not pass validation. Requiring it would mean those pages get dropped from the rankings or have to be redesigned and recoded, and either outcome can have daunting effects on a website.

You may be wondering why so many websites have a link at the bottom saying "W3C Valid." Site owners are under the impression that this adds value when visitors see it. The real reason to validate your website is to catch human errors you may have overlooked when building the site: the validator will show you broken links, broken markup, and other coding issues that can hurt the way visitors experience your pages.

In conclusion, it's still a good idea to run the W3C validation on your site to check for errors and clean up unnecessary code. Who knows, in the coming months or years it may even become a ranking factor, and you'll be one step ahead.
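Even though validation isn't a ranking factor, automating the error check is still handy for catching broken markup. The sketch below queries the W3C Nu HTML Checker's JSON interface with the requests package; the endpoint and field names reflect its documented API at the time of writing and may change, and the URL is a placeholder.

    # Ask the W3C Nu HTML Checker for a JSON report on a page.
    import requests

    response = requests.get(
        "https://validator.w3.org/nu/",
        params={"doc": "https://example.com", "out": "json"},  # placeholder URL
        headers={"User-Agent": "validation-check-script"},
        timeout=30,
    )

    for message in response.json().get("messages", []):
        print(f"{message.get('type', 'info')}: {message.get('message', '')}")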

The post W3C Validation Not Part Of Google Search Engine Ranking Factor first appeared on SEO Optimizers.

    November 2009 Google Page Rank and Algorithm Update https://seooptimizers.com/blog/november-2009-google-page-rank-and-algorithm-update/ https://seooptimizers.com/blog/november-2009-google-page-rank-and-algorithm-update/#respond Sun, 29 Nov 2009 16:24:38 +0000 https://seooptimizers.com/?p=976 Google Algorithm Updates Last Update Google has once again updated their search engine algorithm and Page Rank. This has cause great joy for many webmasters, while others are scrambling to find out what happened to their websites page rank. Goggles newest page rank update, occurring late October/ early November 2009 seems to favor new pages.…

Google Algorithm Updates

Last Update Jun 4, 2024 @ 1:07 pm

    • Google has once again updated their search engine algorithm and Page Rank. This has cause great joy for many webmasters, while others are scrambling to find out what happened to their websites page rank. Goggles newest page rank update, occurring late October/ early November 2009 seems to favor new pages. New pages are pages that have recently been added to your website, or have undergone major changes in content. Existing pages with no updates or slight modifications had a loss in page rank, while pages that are new have passed on the link juice from the homepage to deeper pages. This is consistent in many sites I personally manage for myself and existing search engine optimization clients.

     

    • This just proves that Google loves to see fresh, original, updated content on web pages. Stale, recycled information on your website should be refreshed frequently; not doing so means your pages will suffer lower search engine rankings than desired. Even if the URL changes, a page whose content has not changed will gain no PageRank. This is simply an observation across many new, recently modified, or recently updated websites.
      Simply changing the layout, template, images, and so on will not cut it, because search engine spiders cannot read or interpret those elements. Search engines read text easily, but images and layouts are more challenging. Alt attributes (alternative text) give the search engines a few words describing what each image shows, which can noticeably improve a website's search engine optimization. Changing your template alone will therefore not push your PageRank through the roof; the textual content is the key factor for increasing PageRank in Google's eyes. A quick way to spot images that are missing alt text is sketched below.
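    As a rough illustration of the alt-text point above, here is a minimal sketch using the requests and BeautifulSoup libraries (both assumed to be installed); the page URL is a placeholder.

        # Hedged sketch: list <img> tags that are missing alt text on a page.
        # Uses requests and BeautifulSoup; the URL below is illustrative.
        import requests
        from bs4 import BeautifulSoup

        def images_missing_alt(page_url: str) -> list[str]:
            html = requests.get(page_url, timeout=30).text
            soup = BeautifulSoup(html, "html.parser")
            missing = []
            for img in soup.find_all("img"):
                alt = (img.get("alt") or "").strip()
                if not alt:
                    missing.append(img.get("src", "(no src)"))
            return missing

        if __name__ == "__main__":
            for src in images_missing_alt("https://www.example.com/"):  # illustrative URL
                print("Missing alt text:", src)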

     

    • To improve your Google PageRank you must genuinely change your website content. That does not mean adding a few lines of text, reordering the content you already have, or using auto-generated content. Major changes to the text should be made, even if that means starting from scratch and rewriting all of your existing copy. This is a daunting task and cannot be done for every website; e-commerce sites, for example, would struggle to rewrite their product descriptions several times a year. In such cases, occasionally refreshing the content or making modest modifications can help preserve PageRank.

     

    • The new website copy should be backed by proper keyword research before you implement the changes. That means seeking out profitable keywords for your page: keywords that attract plenty of searches yet have little competition, which usually means going after long-tail, niche keywords. Once these keywords have been identified, they should be worked into the website's title tag, meta description, meta keywords, header tags, alt attributes, and body content. Make sure not to overdo it, as the search engines will treat keyword stuffing as spam, hurting your rankings or even getting your site banned. A simple way to compare keyword candidates is sketched below.
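    The sketch below illustrates the "lots of searches, little competition" idea with a simple opportunity score; every keyword and number in it is a made-up placeholder, since real figures would come from whatever keyword tool you use.

        # Hedged sketch: rank candidate keywords by monthly searches divided by a
        # competition estimate. All values below are invented for illustration.
        candidates = {
            "seo": {"monthly_searches": 200_000, "competition": 0.95},
            "seo services for small e-commerce sites": {"monthly_searches": 900, "competition": 0.20},
            "how to write seo product descriptions": {"monthly_searches": 1_400, "competition": 0.25},
        }

        def opportunity(stats: dict) -> float:
            # Higher searches and lower competition give a higher score.
            return stats["monthly_searches"] / max(stats["competition"], 0.01)

        for keyword, stats in sorted(candidates.items(), key=lambda kv: opportunity(kv[1]), reverse=True):
            print(f"{keyword!r}: score {opportunity(stats):,.0f}")

    In this toy data the long-tail phrases score higher than the head term, which is exactly the trade-off the paragraph above describes.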

     

    • A website should also have a flat navigational hierarchy. This means that your homepage, which typically receives the most PageRank from Google, should link to the most important pages on your website. The more important pages the homepage links to, the better those pages tend to rank, because link juice is passed from one page to another. Imagine a tree with the homepage at the root linking out to many interior pages; pages that are not linked from the homepage will not receive as much link juice. A sketch of how to measure click depth from the homepage follows below.
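    The following sketch illustrates click depth over a small, made-up internal-link graph; in a flat hierarchy the important pages sit one click from the homepage. The paths are placeholders, not a real site.

        # Hedged sketch: compute click depth from the homepage over a hypothetical
        # internal-link graph using breadth-first search.
        from collections import deque

        # Made-up link graph: each page maps to the pages it links to.
        links = {
            "/": ["/services", "/blog", "/contact"],
            "/services": ["/services/seo", "/services/ppc"],
            "/blog": ["/blog/post-1"],
            "/contact": [],
            "/services/seo": [],
            "/services/ppc": [],
            "/blog/post-1": [],
        }

        def click_depths(start: str = "/") -> dict[str, int]:
            depths = {start: 0}
            queue = deque([start])
            while queue:
                page = queue.popleft()
                for target in links.get(page, []):
                    if target not in depths:
                        depths[target] = depths[page] + 1
                        queue.append(target)
            return depths

        for page, depth in sorted(click_depths().items(), key=lambda kv: kv[1]):
            print(f"depth {depth}: {page}")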

     

    • In essence, to avoid losing PageRank your website should update its content three to four times a year, because Google performs major PageRank and algorithm updates roughly that often. The updates usually occur around the same times of year, so expect to see another big update in January 2010. So start writing, or hire a copywriter to help produce SEO-friendly content for your website.

    The post November 2009 Google Page Rank and Algorithm Update first appeared on SEO Optimizers.

    Achieving Top Search Rankings in Microsoft New Decision Engine Bing DEO / SEO https://seooptimizers.com/blog/achieving-top-search-rankings-in-microsoft-new-decision-engine-bing-deo-seo/ Sun, 07 Jun 2009 13:45:06 +0000

    Can Microsoft compete with the search engine giant, Google? It may have looked doubtful when judged by Microsoft's Live Search, but things are looking better with the release of Bing, Microsoft's new search engine, aka "decision engine." It was released on June 3, 2009 and is backed by an extensive marketing campaign estimated at nearly $100 million, including major television ads promoting Bing.

    Bing's simple interface with a colorful background mimics Google's, but with more ambiance to it. The background image changes regularly and contains clickable hot spots; fortunately, you can go back to a past image and find its hot spots in case you saw something interesting there. The hot spots do not add much and can be confusing, especially since they are constantly changing and there is no indication of where they are until you hover over them. When you do, a question related to the image appears along with a link that usually answers it.

    When comparing Bing to Google there are more similarities than differences. Both have a simple interface with a large search bar in the middle of the page, and both offer Images, Videos, Shopping, News, and Maps from the homepage. One difference is that Google also surfaces Gmail while Bing offers Travel. The bigger difference is that Bing uses a changing background image, whereas Google shows a changing logo against a blank background.

    Bing has decided to market itself as a decision engine rather than a search engine. What this means is that Bing blends the kind of search features found on e-commerce websites into a fully functional search engine. Microsoft believes searchers are ready to move beyond the search stage, and that Bing will help them make better decisions. The search results are categorized to make it easier to find what you are looking for.

    The decision engine has been generating plenty of search engine rankings for many of my clients, with a lower bounce rate than the other major search engines. Does this mean that Microsoft has succeeded with an advanced algorithm superior to the big three's, or is it a fluke that the bounce rate is visibly lower than on the other search engines? A quick way to compare bounce rates by referrer is sketched below.
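    For what it's worth, here is one way that bounce-rate comparison could be computed from raw session data; the records below are invented purely to show the arithmetic (single-page sessions divided by total sessions per referrer).

        # Hedged sketch: compare bounce rate by referring search engine.
        # The session records are made up for illustration only.
        from collections import defaultdict

        # Each record: (referrer, pages_viewed_in_session)
        sessions = [
            ("google", 1), ("google", 3), ("google", 1), ("google", 2),
            ("bing", 4), ("bing", 2), ("bing", 1),
            ("yahoo", 1), ("yahoo", 1), ("yahoo", 2),
        ]

        totals = defaultdict(int)
        bounces = defaultdict(int)
        for referrer, pages in sessions:
            totals[referrer] += 1
            if pages == 1:  # a bounce: the visitor viewed only one page
                bounces[referrer] += 1

        for referrer in totals:
            rate = bounces[referrer] / totals[referrer]
            print(f"{referrer}: bounce rate {rate:.0%} ({bounces[referrer]}/{totals[referrer]})")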

    The big question is how to optimize our websites for Bing. We all want to rank at the top of a new search engine that has the potential to take on the established players. The answer is simple: do what you already do for the other major search engines. From my observations, domain age plays a big role; Bing wants to see websites that are established and have been around for a long time.

    The decision engine, Bing, seems to favor websites with plenty of original content on their landing pages. Make sure your page titles are keyword-rich and appropriate for the subject matter; Bing loves titles containing the keywords searchers are actually using, so give every page on your website a good, unique title. Unlike other search engines, Bing also seems to favor linking out: linking from your site to other sites appears to help rankings, perhaps because it signals that your site shares useful information with its users. This is not to say reciprocal linking is good, but linking to sites your users may find valuable is a good idea. A quick audit for missing or duplicate titles is sketched below.
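    A minimal sketch of such a title audit, assuming the requests and BeautifulSoup libraries and using placeholder URLs in place of your real pages:

        # Hedged sketch: flag missing or duplicate <title> tags across a few pages.
        # Uses requests and BeautifulSoup; the URLs below are illustrative.
        from collections import defaultdict
        import requests
        from bs4 import BeautifulSoup

        pages = [
            "https://www.example.com/",
            "https://www.example.com/services",
            "https://www.example.com/blog",
        ]

        titles = defaultdict(list)
        for url in pages:
            soup = BeautifulSoup(requests.get(url, timeout=30).text, "html.parser")
            title = soup.title.get_text(strip=True) if soup.title else ""
            if not title:
                print("Missing title:", url)
            titles[title].append(url)

        for title, urls in titles.items():
            if title and len(urls) > 1:
                print(f"Duplicate title {title!r} on: {', '.join(urls)}")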

    Sign up for an account with Bing, keep an eye on your analytics, and start a pay-per-click (PPC) campaign there. Bing's PPC rates are significantly lower than those of other PPC platforms because there is not yet as much competition or keyword dilution. Within a few months to a year PPC costs will likely begin to mimic those of the other search engines, but for now the prices are superb.

    The post Achieving Top Search Rankings in Microsoft New Decision Engine Bing DEO / SEO first appeared on SEO Optimizers.
