One of the most common terms thrown around on internet forums dedicated to SEO is, of course, "Search Engine Optimization" itself. Even if you are a total newbie to the net, you have most likely already used a search engine, and have probably been using one for years. Search engines are sites that let us search the net for keywords and return the results in a clean, organized way. We can also filter results by file type if we only want to see videos, pictures, audio, PowerPoint presentations, or other kinds of files. Major search engines include Google.com, Yahoo.com, and Bing.com; smaller search engines exist as well, but they tend to be far less well known.
Prior to search engines, it was nearly impossible to discover a website without already knowing its address. The entire purpose of a search engine is to let people find what they are looking for on the net without having to memorize millions of internet addresses. Sites like Google not only categorize pages by topic, but also rank them by popularity.
The more popular and higher quality a website has proven to be, the more likely it is to show up in search results related to its content. Nowadays a website address can start with www. or leave it off; whether to use the www. version is entirely up to the site owner's preference.
Search terms are also known as keywords, and they are the backbone of how search engines categorize results. The websites considered most relevant to the terms you enter occupy the top five slots on the first page, and they receive the vast majority of the search traffic for that term. Sites that are less relevant to a set of keywords end up on a lower rung of the results. After a certain point, many search engines will simply return only slightly relevant sites as results.
Most people will never see those sites, simply because of how low they are ranked. When you search for a term on Google, you will often also see local listings. These listings must be set up and claimed by the local business owner in order to appear in Google's local directory.
Search engines make sites discoverable by indexing them, and then by using special robots called crawlers to pick up on keywords and similar signals so that pages can be categorized properly. Crawlers revisit websites regularly to make sure the content hasn't changed too drastically and that the categorization is still accurate. They pick up on a variety of things, such as keyword placement and the title of the page, and use these to determine what your site is really about.
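To make the idea concrete, here is a minimal sketch of what a crawler might do with a page it has fetched: parse the HTML, pull out the page title, and count keyword frequencies in the visible text. This is a toy illustration using only the standard library; real crawlers are vastly more sophisticated.

```python
# Toy page indexer: extracts the <title> and counts keywords in body text.
# (Illustrative only -- real search engine crawlers use far more signals.)
from html.parser import HTMLParser
from collections import Counter

class PageIndexer(HTMLParser):
    def __init__(self):
        super().__init__()
        self.title = ""
        self._in_title = False
        self.words = []

    def handle_starttag(self, tag, attrs):
        if tag == "title":
            self._in_title = True

    def handle_endtag(self, tag):
        if tag == "title":
            self._in_title = False

    def handle_data(self, data):
        if self._in_title:
            self.title += data
        else:
            # Keep only plain alphabetic words, lowercased for counting.
            self.words.extend(w.lower() for w in data.split() if w.isalpha())

page = ("<html><head><title>Gardening Tips</title></head>"
        "<body>Gardening advice for tomato gardening beginners</body></html>")
indexer = PageIndexer()
indexer.feed(page)
print(indexer.title)                           # Gardening Tips
print(Counter(indexer.words).most_common(1))   # [('gardening', 2)]
```

From the title and the most frequent words, a crawler can make a first guess about what the page is about, which is exactly the categorization step described above.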
You can use Google Webmaster Tools (now called Google Search Console) to see whether your website is being crawled correctly. It will also notify you if Google's bots run into a problem while crawling your site, such as 404 errors or other crawling and linking issues.
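If you want a rough idea of what "checking for a 404" means, here is a small standard-library sketch that fetches a URL and reports its HTTP status. The URLs in the usage comment are hypothetical; in practice you would rely on Google's own crawl reports rather than rolling your own checker.

```python
# Minimal URL status checker, using only the Python standard library.
from urllib.request import urlopen
from urllib.error import HTTPError, URLError

def crawl_status(url):
    """Return the HTTP status code for a URL, or None if unreachable."""
    try:
        return urlopen(url, timeout=10).status
    except HTTPError as err:   # Server responded with an error, e.g. 404
        return err.code
    except URLError:           # DNS failure, refused connection, etc.
        return None

# Hypothetical usage:
#   crawl_status("https://example.com/")         -> 200
#   crawl_status("https://example.com/missing")  -> 404
```

A crawler that gets back a 404 (or no response at all) for one of your pages is exactly the kind of problem Search Console would flag for you.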
But how, one wonders, does a search engine decide which sites are the most relevant, and which deserve the top ranking? How does an engine decide which sites are total bunk, too low quality to count in a search index? The answer is that search engines use a special mathematical formula called an algorithm to decide all of that. The algorithm combines a large number of factors to decide which site deserves the top ranking.
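A toy way to picture "combining a large number of factors" is a weighted sum: each factor gets a score, each score gets a weight, and the totals decide the order. The factor names and weights below are invented for the example; real search algorithms use hundreds of signals, and their weights are not public.

```python
# Toy ranking formula: a weighted sum of made-up factor scores (each 0..1).
# The factors and weights here are illustrative assumptions, not Google's.
WEIGHTS = {"keyword_relevance": 0.5, "inbound_links": 0.3, "page_speed": 0.2}

def rank_score(factors):
    """Combine per-factor scores into one number using the weights above."""
    return sum(WEIGHTS[name] * value for name, value in factors.items())

site_a = {"keyword_relevance": 0.9, "inbound_links": 0.4, "page_speed": 0.8}
site_b = {"keyword_relevance": 0.6, "inbound_links": 0.9, "page_speed": 0.5}

print(round(rank_score(site_a), 2))  # 0.73
print(round(rank_score(site_b), 2))  # 0.67
```

Here site A outranks site B even though B has more inbound links, because keyword relevance carries the largest weight; changing the weights changes the ordering, which is one reason ranking shifts whenever the algorithm is updated.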
Because search engines are always trying to serve users better and better content, their algorithms are prone to change. In some cases in the past, the entire old algorithm of a major search engine was scrapped, and a new algorithm, complete with new programming, was put in its place. There's no guarantee that this won't happen again in the future.
Originally, most search engine algorithms only counted two or three factors, putting the most emphasis on the number of times a keyword appeared on a page and the number of directories linking to the site. This turned many major sites into keyword-stuffed messes that really detracted from the overall experience of browsing the net through a search engine. Since then, search engines have added other signals of a site's quality and relevance to their algorithms, making the process of deciding a page's rank more involved and more quality-based.
Some of the newer additions include filters that penalize sites that overuse keywords for the sake of a higher rank, filters that flag pages submitted to blacklisted directories, and filters that catch sites containing malware. Newer algorithms are also more likely to weigh things like social media shares, discussion of your website by others, and your overall internet presence.
Basically, search engines are what make the internet as popular and as powerful as it is, and it's through their crawlers and unique algorithms that we see the pages we see when we search for something online. They are the biggest providers of organic traffic for sites of every genre, so it's important to make sure your website is in good shape for the major search engines. Google is currently the biggest search engine on the market.