Post by sumiseo558899 on Nov 7, 2024 8:59:57 GMT
"Our site has already undergone one-time optimization, I don't need subscription SEO"
"Why do we need to rebuild the semantic core? We built it a year ago, and we are happy with it!"
"Why do plans/reports include items such as 'Title adjustment', 'Text adjustment', 'Internal link adjustment'?"
"Is it really impossible to do it right the first time?"
Such questions are often heard from the customers and potential clients of SEO services. The fact is that, as in any field, search engine promotion has its own specifics.
SEO is a complex process: the effect of the work is always delayed, and the final result is influenced by many external factors. Working under uncertainty, we have to find the best way to complete each task, and it is precisely in such circumstances that an iterative approach is the best way out.
An iterative approach (from iteration, "repetition") means carrying out the work in parallel with continuous analysis of the results obtained and adjustment of earlier stages.
In this article we will explain why it is impossible to simply do everything at once with a lifetime guarantee of 100% results.
The main SEO tasks
First, let's look at what SEO work includes in general. To successfully adapt a website to user demand, you need to study the specifics of that demand, analyze competitors, and evaluate both the site's problems and its advantages over competitors. Then you need to find growth points, implement and track changes, and analyze the results.
In short, SEO work includes:
identifying and eliminating technical problems and shortcomings that may affect the site's ranking;
building a semantic core and clustering the collected queries;
refining the site structure based on the collected and clustered semantics;
optimizing tags, text and graphic content, and internal linking;
improving the site's presentation in search results;
improving user interaction with the site;
external optimization;
scaling the achieved results (regional promotion and promotion in the CIS countries);
tracking search engine innovations and promptly implementing relevant improvements.
As you may have noticed, the work is multifaceted. Some of it means working through standard checklists; some means analyzing the current situation and taking corrective action. A separate pool of work comes down to "keeping a finger on the pulse" and responding promptly to any changes. The sequence of work is determined by the priority of the tasks.
How priorities are distributed
There is no universal recipe, as all projects are different. It is important to determine which changes need to be implemented first and which can wait until the effect of the earlier work becomes clear.
What is done first:
website audit and subject matter assessment,
fixing critical issues that clearly affect ranking: indexing problems, search engine sanctions for violations, and so on.
Measures to solve such problems must be taken immediately, otherwise all further work will be meaningless.
What is done second:
semantic collection and clustering;
structural improvements;
template tag optimization;
working with text content;
correction of errors and shortcomings on the site (whatever was not critical but still requires attention);
external optimization.
It would seem simple enough: just perform all the work step by step. However, there are nuances. The sequence of work may differ depending on the site's subject matter and the volume of its semantic core.
Semantic collection and clustering
Collecting a complete semantic core is the foundation for further actions. It is the core that determines the sequence of work.
Let's consider what options are possible:
Option 1
Narrow subject matter or medium-sized semantics. In this case, collecting a full semantic core takes from 1-2 days to a week. Work is carried out according to the usual plan.
Example – lift rental services. Narrow subject matter, limited supply, local business.
Option 2
Subject matter with a huge semantic core. For example, a product catalog with different name formulations, product groups with different characteristics and properties.
An example is an online store selling commercial equipment, the range of which includes automation equipment (payment terminals, POS systems), weighing equipment (from the simplest household spring balances to industrial crane scales), industrial furniture (metal cabinets, racks and safes) and many other product groups.
In this case, collecting the complete semantics can take several weeks, and in some cases more than a month. So what about optimization: should we wait until the complete semantics is collected? It is optimal to organize the work from the general to the specific, first template automation, then targeted work on individual sections:
Template semantics + template optimization.
Analysis of successful competitive sites.
Full individual semantics and super-clustering, i.e. splitting clusters into smaller ones, for example by semantic proximity rather than by TOP overlap; this allows pages to be optimized for queries more precisely (a minimal sketch follows below).
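To make "semantic proximity" concrete, here is a minimal, hedged sketch of such super-clustering using TF-IDF vectors and agglomerative clustering. The toy queries, the vectorizer and the distance threshold are all illustrative assumptions; a real project would tune these per topic.

```python
from sklearn.cluster import AgglomerativeClustering
from sklearn.feature_extraction.text import TfidfVectorizer

# Toy queries from the commercial-equipment example above (illustrative only)
queries = [
    "pos system buy", "pos system price", "payment terminal buy",
    "crane scales industrial", "household spring balance",
    "metal cabinet office", "fireproof metal safe",
]

vectors = TfidfVectorizer().fit_transform(queries).toarray()

# distance_threshold controls how fine the sub-clusters get; tune per project
model = AgglomerativeClustering(
    n_clusters=None, distance_threshold=0.9,
    metric="cosine", linkage="average",
)
labels = model.fit_predict(vectors)

for label in sorted(set(labels)):
    print(label, [q for q, l in zip(queries, labels) if l == label])
```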
Option 3
A topic in which there is demand for several main queries plus a huge number of micro-frequency queries. "Micro-frequency" means that each individual query is searched so rarely that it does not show up in Yandex.Wordstat's monthly statistics. But there are very many such queries (tens of thousands), and together they can potentially bring a lot of traffic. In this case we lack information about which queries are popular with users and which are not.
An example of such a topic is auto parts, where there are several main keys ("auto parts/parts + buy/price/toponym/brand/brand+model") and a huge number of queries like "VIN number + buy/toponym" and so on.
Manual work makes sense only for the general "auto parts/spare parts" queries. It is pointless to collect a semantic core for specific parts; it is enough to identify general patterns and work out templates iteratively (as sketched below), rather than handle each key or group of keys with a VIN number individually. All adjustments to how keywords are included in the page code are made in a cycle:
do –> track the result –> make adjustments –> track the result again
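A hedged sketch of what "general patterns instead of individual keys" can look like in practice: normalize each long-tail query into a template by masking the variable part (here a VIN-like 17-character code), then count which templates actually occur. The regex and sample queries are assumptions for illustration.

```python
import re
from collections import Counter

# VINs are 17 characters long and never contain I, O or Q
VIN_RE = re.compile(r"\b[A-HJ-NPR-Z0-9]{17}\b")

queries = [
    "buy part 1HGCM82633A004352",
    "buy part WAUZZZ8K9BA123456",
    "part 1HGCM82633A004352 price moscow",
]

# Mask the VIN so thousands of unique queries collapse into a few templates
templates = Counter(VIN_RE.sub("{vin}", q) for q in queries)
for template, count in templates.most_common():
    print(count, template)  # e.g. 2 "buy part {vin}"
```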
The assembled core determines which specific formulations should be used on the site (in tags, texts, product names and properties), and which would be useless for attracting users.
Clustering shows which queries can be promoted together on one page and which need to be promoted on separate pages. It also helps determine which queries are not suitable for promotion on this site at all (for example, informational queries or queries from another topic, which is not always obvious from the wording).
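For illustration, a minimal sketch of "clustering by TOP": two queries are considered promotable on one page if their top-10 results share enough URLs. The SERP data and the threshold here are assumed inputs; real tools fetch live results.

```python
MIN_SHARED = 4  # how many common top-10 URLs count as "same intent"

# Assumed, pre-fetched top results per query (sets of URLs)
serps = {
    "asthenia in children": {"u1", "u2", "u3", "u4", "u5"},
    "childhood asthenia":   {"u1", "u2", "u3", "u4", "u9"},
    "asthenia in women":    {"u6", "u7", "u8", "u9", "u10"},
}

clusters = []
for query, urls in serps.items():
    for cluster in clusters:
        # compare against the cluster's first (anchor) query
        if len(urls & serps[cluster[0]]) >= MIN_SHARED:
            cluster.append(query)
            break
    else:
        clusters.append([query])

print(clusters)  # the first two queries merge; the third stays separate
```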
Example
An information portal on medical topics decided to publish an article about asthenia to attract traffic. The question arose: what is the best way to present the information? Publish one long text (5,000-7,000 characters), or split it into several short articles for different search queries ("asthenia in children", "asthenia in women", "asthenia how to fight")? Demand analysis and clustering led to the following conclusions:
In this example, "asthenia in children" and "asthenia in women" can be separated into individual articles, since these key queries are defined as separate, independent clusters.
The wording "asthenia how to fight" is not popular among users. Much more often they ask in the wording "how to treat". It is also worth using the words "how to get rid of", "how to cope". This should be taken into account when writing the text.
Even after the semantic core is collected, it periodically needs adjustment. Why does this happen?
Firstly, the range of goods and services may change. No explanation is required here: we remove the unnecessary, add the necessary.
Secondly, user intents change, demand changes – all this needs to be periodically monitored and the promotion strategy adjusted. Therefore, it is recommended to recompile semantics once every six months to a year.
Thirdly, an information section may appear or expand, and accordingly new queries can be added to the semantics.
Structural improvements
Suppose we have collected the semantics, clustered it, compared it with the existing structure and identified which pages are missing. Structural adjustment can be targeted, or it can be a radical reworking. At different stages of a project's life, different types of structural revision may be required.
Example 1: creating additional pages specifically for queries
A website of a manufacturer of tent structures. In 2018, to be in the TOP-10 for the queries "arched frame hangar" and "quickly erected arched hangars", it was enough to optimize the general "Products" section of the website. In 2019 the situation changed: separate pages for arched hangars appeared in the TOP, and the website's positions for these queries dropped to TOP-20. Creating a separate page for this type of product returned the website to the TOP-10.
Example 2: Forming a new page level
A website selling tires and wheels. The catalog structure included pages for individual brands with nested pages for models; each model page listed all available tire/wheel radii. Meanwhile, all competitors kept each radius on a separate page nested under the model, i.e. their structure had one more level, and accordingly their websites had greater volume. Creating this additional level of pages increased the useful volume of the website, implemented the principle of one product offer = one page, and ultimately increased traffic.
Example 3: Implementing new filtering options and creating pages based on the new filters
An online store of furniture fittings. According to the collected semantics, pages for certain types of handles were required: knob handles, bracket handles, rail handles, etc. But there was no such option in the product properties and filters. Adding the necessary properties and creating separate pages by product type brought popular queries into the TOP and additional traffic to the site.
Example 4: Complete restructuring
This happens less often, but it does happen.
Working with text content
Let's clarify right away: this means all the text content a site can have, not just page titles and texts in catalog sections. (Incidentally, texts in catalog categories are most often not needed at all, contrary to the stereotypes that automatic SEO audits keep repeating.) We are talking about every tag and text block on the site: title, description, h1-h6, texts on listings, texts in product previews, announcements of publications, the information section, UGC (user-generated content), etc. Accordingly, the scope of work is significantly wider than writing X texts in Y time.
Work with texts most often proceeds in iterations. Yes, there are certain requirements for writing texts, but how the search engine will weigh the occurrence of key queries and topic-setting words in the content can only be analyzed after publication.
Note that the "from general to specific" approach allows you to save time before receiving the first results. For example, before collecting a full semantic core for a large online store, you can generate a template title and description. To create a template, a superficial analysis of competitors and statistics from Wordstat is enough. This work will take a couple of days, including implementation. And while the semantic core is being collected for the next 3-6 weeks, these tags will already give a partial effect in the form of keyword positions and will begin to attract traffic.
Of course, some semantics will be missed, but this will be corrected later. And while semantics are being collected, the site will already have accumulated behavioral factors, the optimizer will have knowledge of where individual development is not needed at all, and the site owner will have N-th amount of additional profit.
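As a hedged sketch of such template tags: one format string per page type, filled from catalog data. All field names and values here are invented for illustration.

```python
# One format string per page type; fields come from the product database
TITLE_TPL = "{name} - buy for {price} rub. in {city} | {store}"
DESC_TPL = "{name} in stock. Delivery in {city}, warranty, online ordering."

product = {"name": "POS system Atol Optima", "price": "49 900",
           "city": "Moscow", "store": "ShopEquip"}

print(TITLE_TPL.format(**product))
print(DESC_TPL.format(**product))  # unused keys are simply ignored
```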
Correction of errors and shortcomings on the site
Basically, such improvements follow a checklist: the entire site is checked for shortcomings, and all identified errors are corrected. But the site is constantly being improved and filled with new information, so the appearance of new errors is inevitable. Therefore, repeated checks are carried out later, and shortcomings are again corrected as soon as they are identified. In this way, monitoring the technical condition of the site becomes a background task.
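A minimal sketch of such a background check: re-run a small checklist over key URLs and flag anything that regressed. The URLs and the specific checks are illustrative assumptions; real audits cover far more.

```python
import requests

PAGES = ["https://example.com/", "https://example.com/catalog/"]

def check(url: str) -> list[str]:
    """Return a list of problems found on the page (empty list = OK)."""
    problems = []
    resp = requests.get(url, timeout=10)
    if resp.status_code != 200:
        problems.append(f"status {resp.status_code}")
    if "<title>" not in resp.text:
        problems.append("missing <title>")
    if 'name="description"' not in resp.text:
        problems.append("missing meta description")
    return problems

for page in PAGES:
    issues = check(page)
    print(page, "OK" if not issues else issues)
```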
Why some work is done iteratively
After the most critical errors have been corrected and the initial development has been completed, the work is structured according to the following principle: we determine the vector of actions, put forward and test several hypotheses, and ultimately choose the most successful solution.
If we cannot predict the final result 100% in advance, and the total volume of work is large, then the best option would be to act in iterations. In other words, we act according to the algorithm:
analysis of the situation –> formation and implementation of recommendations –> monitoring the result –> adjustment.
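Schematically, that algorithm can be written as a loop that keeps a change only if the tracked metric improved. Here measure, apply and revert are stubs standing in for real tooling, not any specific library.

```python
def run_iterations(changes, measure, apply, revert):
    """Try each change; keep it if the metric improved, roll it back if not."""
    best = measure()                 # baseline before any change
    for change in changes:
        apply(change)                # implement the recommendation
        result = measure()           # monitor once the search engine reacts
        if result > best:
            best = result            # the hypothesis worked: keep it
        else:
            revert(change)           # adjust: roll back what did not help
    return best
```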
An analogy can be drawn with software development, where the approach of gradual (incremental) improvement starts with a basic, minimal configuration and increases the functionality of the product step by step.
For example, once the effect of the first stage's work appears, work begins on increasing CTR. The goal is to increase traffic by improving clickability in search results.
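In numbers (invented for illustration): traffic equals impressions times CTR, so a CTR lift converts directly into clicks even at unchanged positions.

```python
impressions = 100_000                # monthly impressions in search results
ctr_before, ctr_after = 0.02, 0.03   # 2% -> 3% after snippet improvements

print(impressions * ctr_before)  # 2000.0 clicks before
print(impressions * ctr_after)   # 3000.0 clicks after: +50% traffic
```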
To sum it up
SEO means finding the optimal way to achieve results, combined with regular monitoring of indicators and analysis of the project's status. Search results are an unstable environment: Yandex and Google keep upgrading their algorithms, and competitors do not stand still. In addition, new technical errors inevitably appear on a site over time. An iterative approach is the optimal way to work in such unstable conditions.
"Why do we need to rebuild the semantic core? We built it a year ago, and we are happy with it!"
“Why do plans/reports include items such as ‘Title adjustment’, ‘Text adjustment’, ‘Internal link adjustment’?
"Is it really impossible to do it right the first time?"
Such questions can often be heard from content writing service customers and potential clients. The fact is that, as in any field, search engine promotion has its own specifics.
SEO is a complex process, the effect of the work is always delayed, and the final result is influenced by many external factors. Working in conditions of uncertainty, we need to find the best way to complete tasks. It is in such circumstances that an iterative approach is the best way out.
An iterative approach (English iteration - “repetition”) is the execution of work in parallel with continuous analysis of the results obtained and adjustment of previous stages of work.
In this article we will tell you why you can’t just take and do everything at once with a lifetime guarantee of 100% results.
Main SEO works
First, let's look at what kind of work SEO optimization includes in general. In order to successfully adapt a website to user demand, you need to study the features of this demand, analyze competitors, evaluate the problems of the website and its advantages over competitors. Then you need to find growth points, implement and track changes, analyze the result.
In short, SEO optimization work includes:
identifying and eliminating technical problems and shortcomings that may affect the ranking of the site;
formation of a semantic core and clustering of collected queries;
refinement of the site structure based on collected and clustered semantics;
optimization of tags, text and graphic content, internal linking;
work to improve the presentation of the site in search results;
work to improve user interaction with the site;
external optimization work;
scaling up the achieved results (regional promotion and promotion in the CIS countries);
tracking innovations and prompt implementation of relevant improvements.
As you have already noticed, all the work is multifaceted. Some of it is working through standard checklists, others are analyzing the current situation and taking corrective actions. And a separate pool of work is reduced to “keeping your finger on the pulse” and responding promptly to any changes. The sequence of work is determined by the priority of the tasks.
How priorities are distributed
There is no universal recipe, as all projects are different. It is important to find out which changes need to be implemented first, and which can be done after the effect of what has been done previously is clear.
What is done first:
website audit and subject matter assessment,
fixing critical issues that clearly affect ranking: these could be any problems with indexing, sanctions from search engines for any violations, and so on.
Measures to solve such problems must be taken immediately, otherwise all further work will be meaningless.
What is done second:
semantic collection and clustering;
structural improvements;
template tag optimization;
working with text content;
correction of errors and shortcomings on the site (what was not critical, but still requires attention);
external optimization.
It would seem that there is nothing complicated, it is enough to perform all the work step by step. However, there are certain nuances here. The sequence of works may differ depending on the subject of the site and the volume of the semantic core.
Semantic collection and clustering
Collecting a complete semantic core is the foundation for further actions. It is the core that determines the sequence of work.
Let's consider what options are possible:
Option 1
Narrow subject matter or medium-sized semantics. In this case, collecting a full semantic core takes from 1-2 days to a week. Work is carried out according to the usual plan.
Example – lift rental services. Narrow subject matter, limited supply, local business.
Option 2
Subject matter with a huge semantic core. For example, a product catalog with different name formulations, product groups with different characteristics and properties.
An example is an online store selling commercial equipment, the range of which includes automation equipment (payment terminals, POS systems), weighing equipment (from the simplest household spring balances to industrial crane scales), industrial furniture (metal cabinets, racks and safes) and many other product groups.
In this case, the complete collection of semantics can take several weeks, and in some cases - more than a month. And what to do with optimization? Wait until the complete semantics is collected? It is optimal to build the work on the principle from the general to the specific. First, automation, then - point elaboration by sections:
Template semantics + template optimization.
Analysis of successful competitive sites.
Full individual semantics and super-clustering (i.e. splitting clusters into smaller ones, contrary to clustering by TOP, for example, by semantic proximity. Allows for more precise optimization of pages for queries).
Option 3
A topic in which there is demand for several main queries and a huge number of micro-frequency queries. Micro-frequency means that each individual query is rarely searched for, as a result of which the data does not fall into the monthly statistics of Yandex.Wordstat . But there are a lot of such queries (tens of thousands) and in general they can potentially bring a lot of traffic. In this case, we do not have enough information about which queries are popular with users and which are unpopular.
An example of such a topic is auto parts, where there are several main keys “auto parts/parts + buy/price/toponym/brand/brand+model” and a huge number of requests with VIN number + buy/toponym and so on.
Manual work makes sense only for general queries auto parts/spare parts. It is pointless to collect a semantic core for specific parts; it is enough to define general patterns and iteratively work out templates, and not individually for each key/group of keys with a VIN number. All adjustments to the inclusion of keywords in the page code are made in a cycle:
do –> track the result –> make adjustments –> track the result again
The assembled core determines which specific formulations should be used on the site (in tags, texts, product names and properties), and which ones will be useless in terms of attracting users.
Clustering allows you to understand which queries can be promoted together on one page, and which queries need to be promoted on different pages. It also helps to determine which of them are not suitable for promotion on this site at all (for example, if the queries are informational or relate to another topic, which is not always obvious from the wording).
Example
An information portal on medical topics decided to place an article about asthenia to attract traffic. The question arose: what is the best way to place the information? Publish it as one long text (5000 – 7000 characters) or split it into several short articles for different search queries ("asthenia in children", "asthenia in women", "asthenia how to fight"). Demand analysis and clustering led to the following conclusions:
In this example, we can separate “Asthenia in Children” and “Asthenia in Men” into separate articles, since these key queries are defined as separate independent clusters.
fcc60d8863346d161d2b43d030df9428.png
The wording "asthenia how to fight" is not popular among users. Much more often they ask in the wording "how to treat". It is also worth using the words "how to get rid of", "how to cope". This should be taken into account when writing the text.
971c8f7bcad8a3a128eadec80d5fcc41.png
After collecting the semantic core, there is a periodic need to adjust it. Why does this happen?
Firstly, the range of goods and services may change. No explanation is required here: we remove the unnecessary, add the necessary.
Secondly, user intents change, demand changes – all this needs to be periodically monitored and the promotion strategy adjusted. Therefore, it is recommended to recompile semantics once every six months to a year.
Thirdly, the information section appears/expands, and, accordingly, new queries can be added to the semantics.
Structural improvements
We collected semantics, clustered them, compared them with the existing structure and identified which pages were missing. The structure adjustment can be point-by-point or it can be a radical reworking. At different stages of the project life, structural revisions of different types may be required.
Example 1: creating additional pages specifically for queries
Website of a manufacturer of tent structures. In 2018, to be in the TOP-10 queries for "arched frame hangar" and "quickly erected arched hangars", it was enough to optimize the general "Products" section on the website. In 2019, the situation changed: separate pages for arched hangars appeared in the TOP, the website's positions for this query worsened to TOP-20. Creating a separate page for this type of product allowed us to return the website's positions to the TOP-10.
Example 2: Forming a new page level
Website for selling tires and wheels. The catalog structure included pages for individual brands and nested pages with models. The pages with models presented all available tire/wheel radii. While all competitors had tire radii on separate pages nested in models (i.e. there was one more additional level in the structure). Accordingly, the volume of competitors' websites was larger. Creating an additional level of pages allowed us to increase the useful volume of the website, implement pages according to the principle of 1 product offer = 1 page, and ultimately increase traffic.
Example 3: Implementing new filtering options and creating pages based on the new filters
Online store of furniture fittings. According to the collected semantics, pages with certain types of handles are required: knob handles, bracket handles, rail handles, etc. But there was no such option in the product properties and filters. Adding the necessary properties and creating separate pages by product types allowed us to bring popular queries to the TOP and bring additional traffic to the site.
Example 4: Complete restructuring
This happens less often, but it does happen.
Working with text content
Let's clarify right away that this refers to all text content that is possible on the site, and not just page titles and texts in catalog sections. By the way, texts in catalog categories are most often not needed at all, contrary to the established stereotypes that very often appear in automatic SEO audits. We are talking about all tags and text blocks that are possible on the site: title, description, h1-h6, text on listings, text in product previews, announcements to publications, information section, UGC (User-generated content or user content), etc. Accordingly, the scope of work is significantly wider than writing X texts in Y time.
Work with texts most often occurs in iterations. Yes, there are certain requirements for writing texts, but how the search engine will take into account the occurrence of key queries and words in the content that set the topic can only be analyzed after publication.
Note that the "from general to specific" approach allows you to save time before receiving the first results. For example, before collecting a full semantic core for a large online store, you can generate a template title and description. To create a template, a superficial analysis of competitors and statistics from Wordstat is enough. This work will take a couple of days, including implementation. And while the semantic core is being collected for the next 3-6 weeks, these tags will already give a partial effect in the form of keyword positions and will begin to attract traffic.
Of course, some semantics will be missed, but this will be corrected later. And while semantics are being collected, the site will already have accumulated behavioral factors, the optimizer will have knowledge of where individual development is not needed at all, and the site owner will have N-th amount of additional profit.
Correction of errors and shortcomings on the site
Basically, such improvements are carried out according to a checklist: the entire site is checked for shortcomings, and all identified errors are corrected. But the site is constantly being improved and filled with new information, that is, the appearance of new errors is inevitable. Therefore, repeated checks for errors are subsequently carried out. And again, as soon as they are identified, all shortcomings are corrected. Thus, work on monitoring the technical condition of the site moves to the background mode.
Why some work is done iteratively
After the most critical errors have been corrected and the initial development has been completed, the work is structured according to the following principle: we determine the vector of actions, put forward and test several hypotheses, and ultimately choose the most successful solution.
If we cannot predict the final result 100% in advance, and the total volume of work is large, then the best option would be to act in iterations. In other words, we act according to the algorithm:
analysis of the situation –> formation and implementation of recommendations –> monitoring the result –> adjustment.
An analogy can be drawn with the development of software products, where there is such an approach as “Gradual improvement” – starting with a basic (minimum) configuration and gradually, step by step, increasing the functionality of the product being developed.
For example, after the effect of the work done at the first stage appears, work is carried out to increase CTR. The goal is to increase traffic by increasing clickability in search results.
To sum it up
SEO involves finding the most optimal way to achieve results, as well as regular monitoring of indicators and analysis of the project status. Search results are an unstable environment: Yandex and Google are upgrading their algorithms, competitors are not standing still. In addition, over time, new technical errors inevitably appear on the site. An iterative approach is the optimal way to work in unstable conditions.