The term "SEO" (Search Engine Optimization) designates the set of techniques for improving the visibility of a website:
submission: making the site known to the search tools;
ranking: positioning the pages of a site well in the results pages for certain keywords.
The difficulty of the exercise lies not so much in promoting the site to the search engines as in structuring the content and the internal and external linking so as to be well positioned in the results for previously chosen keywords.
Indeed, a majority of Internet users use search engines to find information, and query such an engine using keywords. It is therefore essential, first and foremost, to focus on content that best meets users' needs, and to identify the keywords likely to be entered by them!
SERP
The term SERP (Search Engine Result Pages) denotes the search results as displayed after a query. It is essential to understand that, from one user to another, the results for a given search engine can vary: first according to the settings chosen by the user (language, number of results per page), but also according to the location (country, region) from which the query is made, the device used (mobile, tablet, desktop), or sometimes according to the queries previously made by the user, and finally because the search engines regularly run A/B tests to try out different displays. As such, it is not uncommon for a site to disappear from the SERP for a query for 24 to 48 hours and then reappear. This means that you should wait at least 72 hours before worrying.
This also means that seeing yourself in first position does not guarantee that you really are. To obtain the result closest to what the majority of users see, it is advisable to disable the query history, or even to browse using your browser's private browsing mode.
Pages ranked in first position obviously get more visits, followed by pages in second position, and so on; similarly, pages ranked on the first results page get far more visits than those on the second page. Thus, if a page is in 11th position (and therefore on the second page), it is well worth trying to optimize it to bring it to the first page and obtain a significant gain in unique visitors.
Keywords
SEO only makes sense with respect to keywords, that is to say, the words used by visitors to search. The first task is therefore to determine the keywords on which you wish to position the pages of your site. The keywords you have in mind do not always correspond to the keywords actually used by visitors, who tend to use the shortest possible terms and to make spelling mistakes.
Tools exist for comparing the search volume of one keyword against another and for obtaining suggestions.
Finally, there are sites for finding the keywords of competing sites:
SEMRush.com
White hat / Black hat SEO
When it comes to SEO, two schools of thought are generally distinguished:
White hat SEO: designating SEOs who strictly comply with the search engines' guidelines for webmasters, hoping to achieve sustainable rankings by playing by the rules of the game;
Black hat SEO: designating SEOs who adopt techniques contrary to the search engines' guidelines in order to obtain a quick gain on pages with high monetization potential, but with a high risk of being downgraded. Black hat SEOs play cat and mouse with the search engines, which regularly adapt their algorithms to identify and downgrade sites that do not comply with the instructions. Techniques such as cloaking or content spinning are thus considered dangerous and not recommended.
Submit your website
Before talking about search engine optimization, the first step is to ensure that the main search engines, and especially Google (since it is the most used), identify the site and come to crawl it regularly.
To do this, online forms exist for submitting your website:
Submit your site on Google
Submit your site on Bing
Submit your website to Yahoo
Free SEO
SEO is not necessarily paid for: the search engines index the content of sites for free, and it is not possible to pay them to rank a site better.
Paid Search
However, it is possible to buy keywords on the search engines; this then involves advertising spaces (called sponsored links), located around the so-called natural search results. This is called SEM (Search Engine Marketing), as opposed to SEO (Search Engine Optimization).
Moreover, since SEO is a broad concept requiring a great deal of experience and presenting many hidden pitfalls, it is advisable for companies to call on agencies specialized in SEO, which will advise and support them.
Search engine optimization
The reference unit for the search engines is the web page, so when designing the website you must think about structuring the pages, taking the tips below into account for each page.
In fact, most webmasters think of properly indexing the homepage of their site but neglect the other pages, whereas it is usually the other pages that contain the most interesting content. It is therefore imperative to choose a title, a URL and meta tags (etc.) for each of the site's pages.
There are a few site design techniques that give the pages of a site more effective SEO:
original and attractive content,
a well-chosen title,
a suitable URL,
a body text readable by the engines,
META tags that accurately describe the content of the page,
well-thought-out links,
ALT attributes describing the content of the images.
Content of the web page
Search engines seek above all to provide a quality service to their users by giving them the most relevant results for their queries; so, even before thinking about improving your ranking, it is essential to focus on creating substantial and original content.
Original content does not mean content that no other site offers (that would be an impossible mission). However, it is possible to treat a subject and add value to it by deepening certain points, by organizing it in an original way or by connecting different pieces of information. Social networks are, in this respect, an excellent way to promote content and to identify the interest that readers show in your content.
On the other hand, always with the objective of providing the best content to visitors, search engines attach importance to how up to date the information is. Updating the site therefore improves the score given by the engine to the website, or at any rate the visit frequency of the crawler.
Page title
The title is the preferred element for briefly describing the content of the page; it is in particular the first element the visitor will read on the search engine's results page, so it is essential to give it particular importance. The title of a web page is declared in the header of the page, between the <TITLE> and </TITLE> tags.
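For example, a page title is declared as follows (the wording here is purely illustrative):

```html
<head>
  <!-- A short, descriptive and unique title (illustrative example) -->
  <title>SEO: submitting and ranking a website</title>
</head>
```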
The title should describe as precisely as possible, in 6 or 7 words at most, the content of the web page, and its total length should ideally not exceed around sixty characters. Finally, it should ideally be unique within the site, so that the page is not considered duplicate content.
The title is all the more important as it is the information that will appear in the user's bookmarks, in the title bar and tabs of the browser, as well as in the history.
Given that European users read from left to right, it is advisable to place the words carrying the most meaning at the beginning (left) of the page title.
Make sure, in particular, that each page of your site has a unique title, including pages with pagination. In the latter case, you can for example make the pages beyond page 1 include the page number in the title.
URL of the page
Some search engines attach major importance to the keywords present in the URL, and in particular to the keywords present in the domain name. It is therefore advisable to give each file of the site a suitable name containing one or two keywords, rather than names like page1.html, page2.html, etc.
Kioskea uses a technique known as URL rewriting to produce readable URLs containing the keywords of the page title. On CCM, the dash is used as the separator.
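As an illustration, URL rewriting is often configured on an Apache server with mod_rewrite; the rule below is a hypothetical sketch (the path and script names are assumptions, not Kioskea's actual configuration):

```apache
# .htaccess (assumes mod_rewrite is enabled) - illustrative only
RewriteEngine On
# Map a readable URL such as /faq/123-keyword-rich-title to the real script
RewriteRule ^faq/([0-9]+)-[a-z0-9-]+$ faq.php?id=$1 [L]
```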
Body of the page
To make the most of the content of each page, it must be transparent (as opposed to opaque content such as Flash), that is to say, it must contain a maximum of text that can be indexed by the engines. The content of the page must above all be quality content addressed to visitors, but it can be improved by making sure that the various keywords are present.
Frames are strongly discouraged, as they sometimes prevent the site from being indexed properly.
META tags
META tags are non-displayed tags, inserted at the beginning of the HTML document, used to describe the document more finely. Given the misuse of META tags observed on a large number of websites, the engines now make less use of this information when indexing pages. The "keywords" META tag has been officially abandoned by Google.
Meta Description
The meta description tag is used to add a description of the page without displaying it to visitors (for example, terms in the plural, or even with deliberate spelling mistakes). It is usually this description (or part of it) that will appear in the SERP. It is advisable to use HTML encoding for accented characters and not to exceed around twenty keywords.
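For example (the wording is illustrative):

```html
<meta name="description" content="Practical guide to SEO: submission, ranking, keywords and META tags explained step by step." />
```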
META robots
The robots META tag is particularly important because it describes the page's behavior vis-à-vis robots: it indicates whether the page should be indexed or not, and whether the robot is authorized to follow the links.
By default, the absence of a robots tag indicates that the robot may index the page and follow the links it contains.
The robots tag can take the following values:
index, follow: this statement amounts to not putting a robots tag, since it is the default behavior;
noindex, follow: the robot must not index the page (but it may return regularly to check whether there are new links);
index, nofollow: the robot must not follow the links of the page (on the other hand, it may index the page);
noindex, nofollow: the robot must no longer index the page, nor follow its links. This will result in a drastic decrease in the frequency of robot visits to the page.
Here is an example of a robots tag:
<meta name="robots" content="noindex, nofollow" />
Also note the existence of the following values, which can be combined with the previous ones:
noarchive: the robot must not offer users the cached version (notably for the Google cache);
noodp: the robot must not offer, by default, the description from DMOZ (the Open Directory Project).
It is possible to specifically target Google's crawlers (Googlebot) by substituting the name googlebot for robots (it is nevertheless advisable to use the standard tag to remain generic):
<meta name="googlebot" content="noindex, nofollow" />
In cases where a large number of pages should not be indexed by the search engines, it is preferable to block them via robots.txt, because the crawlers then do not waste time crawling those pages and can focus all their energy on the useful pages.
On the Kioskea forum, questions that have not been answered are excluded from the search engines, but the engines may continue to crawl the pages in order to follow the links:
<meta name="robots" content="noindex, follow" />
After one month, if the question still has no answer, the META tag becomes the following, so that the engines forget it:
<meta name="robots" content="noindex, nofollow" />
Internal links
In order to give maximum visibility to each of your pages, it is advisable to establish internal links between them, to allow the crawlers to browse your entire tree structure. It may thus be worthwhile to create a page presenting the architecture of your site and containing links to each of your pages.
This means, by extension, that the site's navigation (the main menu) must be designed to effectively give access to the pages with high SEO potential.
Netlinking
The term netlinking refers to obtaining external links pointing to your website, on the one hand because it increases the site's traffic and reputation, and on the other hand because the search engines take into account the number and quality of the links pointing to a site in order to characterize its level of relevance (this is the case of Google with its index called PageRank).
Nofollow links
Links are followed by default by the search engines (in the absence of a robots META tag with nofollow, or of a robots.txt file preventing the indexing of the page). However, it is possible to tell the search engines not to follow certain links by using the rel="nofollow" attribute.
This is notably recommended if:
the link is the subject of a commercial agreement (paid links);
the link is added by untrusted users in contribution areas of the site (comments, reviews, forums, etc.).
On Kioskea, the links posted by anonymous users or by users who have not actively participated in the community (help on the forums) are nofollow links. The links posted by active users and contributors are normal links (called "dofollow").
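In HTML, this is done with the rel attribute of the link; for example (the URL and anchor text are illustrative):

```html
<!-- A link that search engines are asked not to follow -->
<a href="http://example.com/" rel="nofollow">Link posted by an anonymous user</a>
```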
ALT attributes of images
The images of a site are opaque to the search engines, that is to say, the engines are not able to index their content; it is therefore advisable to put an ALT attribute on each image, describing its content. The ALT attribute is also essential for blind people browsing with Braille terminals.
An example of an ALT attribute:
<img src="images/keyword-promotion.gif" width="140" height="40" border="0" alt="Kioskea logo">
It is also advisable to fill in the title attribute, in order to display a tooltip describing the image to the user.
Improve the crawl
SEO begins with the crawl (in French: exploration) of your site by the search engine robots. These are agents that browse websites in search of new pages to index, or of pages to update. A crawler acts somewhat like a virtual visitor: it follows the links present on your site to explore as many pages as possible. These robots can be identified in the logs by the User-Agent HTTP header they send. Here are the user agents of the main search engines: Googlebot, etc.
You should therefore interlink your pages intelligently, so as to allow the robots to access as many pages as possible, as quickly as possible.
To improve the indexing of your site, several useful methods exist:
Robots.txt
It is possible, and desirable, to block the pages you do not want indexed using a robots.txt file, in order to allow crawlers to devote all their energy to the useful pages. Duplicate pages (for example pages whose URL parameters are useless to robots) and pages of little interest to visitors coming from a search (internal site search results, etc.) should typically be blocked;
On Kioskea, the results of the internal search engine are explicitly excluded from indexing via the robots.txt file, so as not to provide users arriving via a search engine with automatically generated results, in accordance with Google's guidelines.
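As an illustration, a robots.txt file blocking internal search results could look like this (a hypothetical sketch with an assumed /search/ path, not Kioskea's actual file):

```text
# robots.txt - placed at the root of the site (illustrative example)
User-agent: *
# Keep crawlers out of automatically generated internal search results
Disallow: /search/
```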
Page load speed
It is important to improve page load time, for example by using caching mechanisms, because this makes it possible, on the one hand, to improve the user experience and thus visitor satisfaction, and on the other hand because search engines increasingly take this type of signal into account when positioning pages;
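For example, on an Apache server, simple browser-caching rules can be declared as follows (a hypothetical sketch assuming mod_expires is enabled; the durations are arbitrary):

```apache
# .htaccess - illustrative caching rules (requires mod_expires)
ExpiresActive On
ExpiresByType image/gif "access plus 1 month"
ExpiresByType text/css "access plus 1 week"
```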
Sitemap
Creating a sitemap file allows you to give the robots access to all of your pages, or to the latest pages to be indexed.
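A minimal sitemap.xml sketch, following the standard sitemaps.org protocol (the URL and date are illustrative):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://your-site/page-1</loc>
    <lastmod>2015-01-15</lastmod>
  </url>
</urlset>
```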
Social Networks
More and more search engines take social sharing signals into account in their algorithm. Google Panda notably uses this criterion to determine whether a site is of good quality or not. In other words, encouraging social sharing reduces the risk of being impacted by algorithms such as Panda.
Kioskea pages contain asynchronous sharing buttons, so as not to slow down page loading, as well as Open Graph og:image meta tags to tell social networks which image to display when a user shares a link.
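The Open Graph tag in question takes the following form (the image URL is illustrative):

```html
<meta property="og:image" content="http://your-site/images/share-image.jpg" />
```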
Mobile site SEO
The ideal is to have a mobile site designed in responsive design, because in that case the page indexed for desktop computers and for mobile devices is the same; only the display changes according to the display device.
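A responsive page typically declares a viewport meta tag and adapts its layout with CSS media queries; a minimal sketch (the class name and breakpoint are assumptions):

```html
<meta name="viewport" content="width=device-width, initial-scale=1" />
<style>
  /* Illustrative: hide a secondary column on small screens */
  @media (max-width: 600px) {
    .sidebar { display: none; }
  }
</style>
```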
If your mobile website is on a separate domain or subdomain, as is the case for Kioskea, simply redirect users to the mobile site automatically, making sure that each redirected page points to its equivalent on the mobile site. You should also make sure that the Googlebot-Mobile crawler is treated as a mobile device!
Google has indicated that as of April 21, 2015, "mobile-friendly" pages get an SEO boost over non-mobile-friendly pages in mobile search results. This boost is applied page by page and is re-evaluated continuously for each page, depending on whether or not it passes the test.
To go further: SEO for a mobile site
Duplicate content
As far as possible, you should create unique page titles across the whole site, because search engines such as Google tend to ignore duplicate content, that is to say either many pages of the site sharing the same title, or pages whose main content already exists on the site or on third-party sites.
Duplicate content is natural, if only because we are led to make quotations, to report the statements of public figures or to refer to official texts. However, too much duplicate content on a site can lead to an algorithmic penalty, so it is advisable to block such content using a robots.txt file or a robots META tag with the value "noindex".
Canonical tag
When the search engines detect duplicate content, they retain only one page, according to their own algorithms, which can sometimes lead to errors. It is therefore advisable to include, in pages with duplicate content, a canonical tag pointing to the page to keep. Here is the syntax:
<link rel="canonical" href="http://your-site/page-finale" />
In general, it is advisable to include in your pages a canonical tag with the URL of the current page. This notably makes it possible to limit the loss linked to useless parameters in the URL.
It is also useful for index pages, because Google sometimes indexes your homepage under several forms.
Penalties
Two types of penalties exist:
Manual penalties, that is to say resulting from human action, following a violation of the Webmaster Guidelines. These may involve unnatural (purchased) links, artificial content, sneaky redirects, etc. Penalties for buying links are common and penalize the site that sold the links as well as those that bought them. These penalties can only be lifted after the problem has been corrected (assuming it has been identified) and a reconsideration request has been made for the site via the dedicated form. The review of a website can take several weeks and does not necessarily lead to a recovery of position, or sometimes only a partial one;
Algorithmic penalties, that is to say resulting from no human action, usually linked to a combination of factors that only the search engine knows. This is the case, for example, of Google Panda, Google's algorithm for downgrading sites said to be of poor quality, or of Google Penguin, the algorithm targeting bad SEO practices. These penalties can only be lifted once the "signals" leading to the downgrade have been eliminated, at the next iteration of the algorithm.
Google Algorithm
Google's algorithm is the set of instructions allowing Google to produce a results page following a query.
PageRank
Originally, the algorithm was based solely on the study of the links between web pages, relying on an index assigned to each page and named PageRank (PR). The principle is simple: the more incoming links a page has, the higher its PageRank. The higher a page's PageRank, the more it passes on to its outgoing links. By extension, we speak of the PageRank of a site to designate the PageRank of its homepage, since it is usually the page with the highest PageRank of all the site's pages.
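This principle corresponds to the original PageRank formula published by Brin and Page, where d is a damping factor (typically 0.85), T1…Tn are the pages linking to page A, and C(T) is the number of outgoing links of page T:

```
PR(A) = (1 - d) + d * ( PR(T1)/C(T1) + ... + PR(Tn)/C(Tn) )
```

Intuitively, each page distributes its own PageRank equally among the pages it links to, which is why a link from a high-PageRank page with few outgoing links is worth more than one from a low-PageRank page with many.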
Optimization of the algorithm
Since PageRank, the algorithm has taken a large number of additional signals into account, including (non-exhaustive list):
the freshness of the information;
the mention of the author;
the time spent on the page;
the degree of involvement of the reader;
the sources of traffic other than SEO;
etc.
Google announces making around 500 optimizations to the algorithm per year, that is to say more than one change per day. The SERP may therefore vary significantly depending on the changes made by the Google teams.
Google Caffeine
Google Caffeine is the name given to the new architecture deployed by Google in August 2009 (and regularly improved since), whose aim is to take updated information into account more rapidly, resulting in an improved crawl and, therefore, fresher results in the search results.
Google Panda
Panda is the name given to the filter deployed in 2011 by Google to fight against poor-quality sites. The idea is to downgrade the positioning of sites whose content is judged to be of too low quality.
Google Penguin
Deployed in 2012, Google Penguin is a Google update penalizing sites whose SEO optimization is judged excessive. This is the case, for example, of sites with too many links coming from sites deemed to be "spamming". An abuse of links between pages dealing with disparate subjects also seems to be a factor that can lead to a penalty via the Google Penguin algorithm. Google has set up a form for disavowing links potentially harmful to a website's SEO. A major update of the algorithm was deployed in May 2013, then in October 2014 (see the history of Google Penguin deployments).
Recovering from an algorithmic penalty
First of all, make sure that the drop is indeed linked to a change in the algorithm. To do this, determine whether the decline coincides with a known deployment of Panda or Penguin. If so, it is very likely that they are linked. Note that a deployment can take several days, or even weeks, which means that the decline is not necessarily abrupt.
To recover from an algorithmic penalty, it is advisable to carry out a manual review of the main content pages and to check, point by point, whether the quality is up to standard and the content is unique. In the case of insufficient quality, the content will need to be modified to improve it, to differentiate it (or to delete it). For Google Penguin, examine the links pointing to the site in Google Webmaster Tools and make sure that the links are, on the one hand, natural and, on the other hand, of good quality (coming from sites that do not look like spam).