What is Search Engine Optimization (SEO)?

Where does the term Search Engine Optimization (SEO) come from? The first search engines appeared in the early 1990s, and many more were created before Google arrived in the late 1990s. Then the Web boom began, and people realized that websites could really make money. They came to the conclusion that they needed to attract traffic to their websites. And what was the best source of traffic? Search engines. So website owners began to think about how they could reach the first position in the search results.

Search Engine Optimization (SEO) focuses on organic search results, which are not paid.

What is Search Engine Optimization (SEO)?

Search Engine Optimization (SEO) is the technique of improving the visibility of a website/blog so that it ranks in a better position in the organic results of search engines.

The goal is to appear in the highest possible position of the organic search results for one or more specific keywords that have been selected in advance and are relevant to our website/blog. This will increase visitor traffic to our website/blog.

The term Search Engine Optimization (SEO) is also used to refer to the people who perform this positioning work. Traditionally, people dedicated to Search Engine Optimization have had a technical profile (programmers, web designers, etc.), but with new SEO trends, knowledge of marketing and communication is increasingly important.

Search Engine Optimization (SEO) is a discipline of online marketing, alongside others such as SEM (Search Engine Marketing), SMM (Social Media Marketing), Email Marketing, Web Analytics, etc.

It is very important to understand that Search Engine Optimization is a medium- to long-term job; results can take months to achieve. The number of keywords a site can rank for is limited and depends largely on the number of pages the website/blog contains. Therefore, the initial selection of keywords for any site is very important.

In any case, it is impossible to guarantee the position a keyword will obtain in a given period of time, since SEO positioning involves many variables that cannot be controlled, such as changes in search engine indexing algorithms.

Although there are hundreds of factors on which a search engine bases the position of one page over another, one could say that there are two basic factors:

  1. Authority 
  2. Relevancy 

Authority:

Authority is basically the popularity of a website: the more popular it is, the more valuable the information it contains is considered to be. The more its content is shared, the more users have found it useful.
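
To make this concrete, the classic way to measure link-based authority is an iterative algorithm in the spirit of PageRank. The sketch below is a toy Python version; the three-page link graph and the damping factor are assumptions made up for the demo, not how any real search engine is configured.

```python
# A toy illustration of link-based authority in the spirit of PageRank.
# The link graph and the damping factor are assumptions for the demo.
def pagerank(links, damping=0.85, iterations=50):
    pages = list(links)
    rank = {page: 1.0 / len(pages) for page in pages}
    for _ in range(iterations):
        new_rank = {}
        for page in pages:
            # Sum the rank passed on by every page that links to this one.
            incoming = sum(rank[other] / len(links[other])
                           for other in pages if page in links[other])
            new_rank[page] = (1 - damping) / len(pages) + damping * incoming
        rank = new_rank
    return rank

# Hypothetical three-page web: a.com and b.com both link to c.com.
links = {"a.com": ["c.com"], "b.com": ["c.com"], "c.com": ["a.com"]}
print(pagerank(links))  # c.com ends up with the highest score
```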

Relevancy:

Relevancy is the relationship of a page/post to a given search query. It is not simply a matter of a page containing the search term many times; a search engine relies on hundreds of on-site factors to determine it.

SEO work can be divided into three major groups:

  1. On-Page SEO
  2. Off-Page SEO
  3. Technical SEO

On-Page SEO: 

On-Page SEO (on-site) refers to the relevancy of the website, page, or post. It ensures that the website is optimized so that the search engine understands the main thing: its content. Within On-Page SEO we would include keyword optimization, load time, user experience, code optimization, and the format of URLs.
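
As a minimal illustration, some of these on-page elements can be checked programmatically. The Python sketch below parses a page's HTML with the standard library and flags a few common on-page issues; the thresholds (a title under 60 characters, exactly one h1) are rules of thumb assumed for the example, not official limits.

```python
# A minimal on-page check using only the standard library: parse a
# page's HTML and flag common issues. The thresholds are illustrative
# rules of thumb, not official search engine limits.
from html.parser import HTMLParser

class OnPageAudit(HTMLParser):
    def __init__(self):
        super().__init__()
        self.in_title = False
        self.title = ""
        self.meta_description = None
        self.h1_count = 0

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "title":
            self.in_title = True
        elif tag == "h1":
            self.h1_count += 1
        elif tag == "meta" and attrs.get("name") == "description":
            self.meta_description = attrs.get("content", "")

    def handle_endtag(self, tag):
        if tag == "title":
            self.in_title = False

    def handle_data(self, data):
        if self.in_title:
            self.title += data

# An invented page for the demo.
html_doc = """<html><head><title>What is SEO?</title>
<meta name="description" content="A beginner's guide to SEO.">
</head><body><h1>What is SEO?</h1></body></html>"""

audit = OnPageAudit()
audit.feed(html_doc)
print("title under 60 chars:", len(audit.title) <= 60)
print("has meta description:", audit.meta_description is not None)
print("exactly one h1:", audit.h1_count == 1)
```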

Off-Page SEO: 

Off-Page SEO (off-site) is the part of the work that focuses on factors external to the web page we are working on. The most important factors in Off-Page SEO are the number and quality of links, presence on social networks, mentions in local media, brand authority, and performance in the search results.

Technical SEO:

Technical SEO is the part of Search Engine Optimization that focuses on the technical optimization of a website; this includes, for example, the source code, operating system, and server configuration. Technical SEO is never solely responsible for good keyword positioning; rather, it is a prerequisite for future optimization actions.
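
A few of these server-level details can be inspected with a simple request. The Python sketch below is an assumed, minimal example of such a check (the URL is a placeholder); it reports the status code, the final URL after any redirects, the content type, and a rough load time.

```python
# An assumed, minimal technical check: request a page and report a few
# server-level details. The URL is a placeholder.
import time
from urllib.request import urlopen

def technical_check(url):
    start = time.time()
    response = urlopen(url, timeout=10)
    elapsed = time.time() - start
    return {
        "status": response.status,          # 200 means the page is reachable
        "final_url": response.url,          # differs from url after a redirect
        "content_type": response.headers.get("Content-Type"),
        "load_seconds": round(elapsed, 2),  # slow pages hurt crawling
    }

print(technical_check("https://example.com/"))
```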

Once we have learned what Search Engine Optimization is, we must decide whether or not to follow the “recommendations” of the search engines. This choice gives rise to two types of techniques: Black Hat SEO and White Hat SEO.

Black Hat SEO: 

Black Hat SEO is the attempt to improve the search engine positioning of a web page through techniques that are unethical or that contradict the search engine’s guidelines. Cloaking, spinning, spam in forums and blog comments, and keyword stuffing are examples of Black Hat SEO. Black hat techniques can provide benefits in the short term, but they are generally a risky strategy with no long-term continuity, and they add no value.

White Hat SEO: 

White Hat SEO consists of all those actions that are ethically correct and meet the search engines’ guidelines for ranking a web page/post in the search results. Since search engines give greater importance to the pages that best respond to a user’s search, White Hat SEO comprises the techniques that seek to make a page more relevant to search engines by providing value to its users.

Importance of SEO:

The most important function of SEO is to make your website more useful for both users and search engines. A search engine cannot see a web page/post the way a human does, so Search Engine Optimization (SEO) is necessary to help search engines understand what each page of the website is about and whether or not it is useful for users.

SEO is the way for users to find you through search queries for which your website is relevant. These users are already looking for what you offer, and a search engine is the best way for them to find and reach you.

How do search engines work?

A search engine works in two steps: crawling and indexing.

  1. Crawling
  2. Indexing

Crawling:

A search engine crawls the web using programs called bots (also known as crawlers). These move through all the pages of a website by following its links; hence the importance of a good link structure. Just like any user browsing the content of a website, they pass from one link to another and collect data about those web pages, which they send back to their servers.

The crawling process begins with a list of website addresses from previous crawls and from sitemaps provided by web pages. Once the bots access these web pages, they look for links to other pages to visit. Bots are drawn above all to new sites and to changes in existing web pages.

It is the bots themselves that decide which pages to visit, how often, and for how long they will crawl a website, which is why it is important to have an optimal loading time and updated content.
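
The crawl loop just described can be sketched in a few lines of Python using only the standard library: start from a seed list, fetch each page, extract its links, and queue newly discovered URLs. The seed URL and the page limit are placeholders for the example.

```python
# A minimal sketch of the crawl loop: fetch a page, extract its links,
# and queue newly discovered URLs. Standard library only; the seed URL
# and the page limit are placeholders.
from html.parser import HTMLParser
from urllib.parse import urljoin
from urllib.request import urlopen

class LinkExtractor(HTMLParser):
    def __init__(self, base_url):
        super().__init__()
        self.base_url = base_url
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            href = dict(attrs).get("href")
            if href:
                # Resolve relative links against the current page.
                self.links.append(urljoin(self.base_url, href))

def crawl(seed_urls, max_pages=10):
    queue, seen = list(seed_urls), set()
    while queue and len(seen) < max_pages:
        url = queue.pop(0)
        if url in seen:
            continue
        seen.add(url)
        try:
            html = urlopen(url, timeout=5).read().decode("utf-8", "ignore")
        except OSError:
            continue  # unreachable page: skip it and keep crawling
        extractor = LinkExtractor(url)
        extractor.feed(html)
        queue.extend(extractor.links)  # follow links to discover new pages
    return seen

print(crawl(["https://example.com/"]))
```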

It is very easy to stop search engine bots/crawlers from crawling certain pages or content of a website so that they do not appear in the search results. To do this, you can use the “robots.txt” file to tell search engine bots not to crawl certain pages/content.
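
Python’s standard library even ships a parser for this file, which a well-behaved crawler consults before fetching a page. In the sketch below, the rules and URLs are made up for the example.

```python
# A well-behaved crawler consults robots.txt before fetching a page.
# Python's standard library includes a parser; the rules and URLs
# below are made up for the example.
from urllib.robotparser import RobotFileParser

rules = [
    "User-agent: *",
    "Disallow: /private/",  # keep everything under /private/ out of the crawl
]

parser = RobotFileParser()
parser.parse(rules)
print(parser.can_fetch("*", "https://example.com/private/page"))  # False
print(parser.can_fetch("*", "https://example.com/blog/post"))     # True
```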

Indexing:

Once a bot has crawled a website and collected the necessary information, the pages are included in an index. This way, when we make a query to the search engine, it is much easier for it to show us the results most closely related to our query.

At first, search engines were based on the number of times a word was repeated. When a search was made, they looked those terms up in their index to find which pages contained them in their texts, ranking higher the page that repeated the term the most times. Today they are more sophisticated and base their indexes on hundreds of different aspects: the date of publication, whether the page contains images, videos, or animations, microformats, etc., and they give more importance to the quality of the content.
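
The data structure behind this step is an inverted index: a map from each word to the pages that contain it, with a count of how often it appears. A minimal Python sketch, with two invented documents:

```python
# A minimal inverted index: each word maps to the pages that contain
# it and how often. The two documents are invented for the demo.
from collections import defaultdict

docs = {
    "page1": "seo helps search engines understand a website",
    "page2": "search engines crawl and index every website page",
}

index = defaultdict(dict)  # word -> {page: number of occurrences}
for page, text in docs.items():
    for word in text.split():
        index[word][page] = index[word].get(page, 0) + 1

print(index["seo"])     # {'page1': 1}
print(index["search"])  # {'page1': 1, 'page2': 1}
```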

Once the pages/posts have been crawled and indexed by the search engine, the moment comes when the algorithm acts: algorithms are the computer processes that decide which pages appear earlier or later in the search results. When a search is made, the algorithms check the indexes and determine the most relevant pages/content, taking into account hundreds of positioning factors. And all this happens in a matter of milliseconds.
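
Continuing the sketch above, the lookup step of such an algorithm can be illustrated with a toy scoring function that ranks pages by how many of the query’s words they contain. Real algorithms weigh hundreds of factors; this only shows the mechanics.

```python
# A toy scoring function over the inverted index built above: each
# page scores one point per occurrence of each query word it contains.
from collections import defaultdict

def rank(query, index):
    scores = defaultdict(int)
    for word in query.split():
        for page, count in index.get(word, {}).items():
            scores[page] += count
    # Highest score first, i.e. the page the index deems most relevant.
    return sorted(scores.items(), key=lambda item: item[1], reverse=True)

print(rank("seo search engines", index))  # [('page1', 3), ('page2', 2)]
```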
