
The page content needs to be optimised so that both search engines and human
visitors deem it a useful website. A while back, effective optimisation
entailed stuffing content with as many keywords as possible, and while this
once generated good search engine results, it invariably put visitors off when
they found the content was irrelevant. It is also now frowned upon and
penalised as "spam" by all of the major search engines.

Search engines use specific algorithms to determine the relevance of your
website. The calculations from these algorithms determine where on the search
engine result pages your website will appear. To keep SEOs (search engine
optimisers) from reverse-engineering aspects of their algorithms, and to
ensure that results are always up to date, the major search engines regularly
update their search algorithms. To optimise your website successfully you need
to understand how search engines work; for more background, see the paper
The Anatomy of a Large-Scale Hypertextual Web Search Engine, excerpted below.

The most recent algorithm updates have shifted the focus away from optimising
websites for search engines and towards promoting websites that give searchers
the best user experience. The algorithms are not only changing, they are
evolving into more intelligent and accurate systems. SEOs should still
optimise their websites for specific keywords, but a good rule of thumb is to
think of the end-user experience when building web pages.

PageRank: Bringing Order to the Web
The citation (link) graph of the web is an important resource that has
largely gone unused in
existing web search engines. We have created maps containing as many as
518 million of
these hyperlinks, a significant sample of the total. These maps allow
rapid calculation of a
web page's "PageRank", an objective measure of its citation importance
that corresponds
well with people's subjective idea of importance. Because of this
correspondence,
PageRank is an excellent way to prioritize the results of web keyword
searches. For most
popular subjects, a simple text matching search that is restricted to web
page titles
performs admirably when PageRank prioritizes the results. For the type of
full text searches
in the main Google system, PageRank also helps a great deal.
Academic citation literature has been applied to the web, largely by
counting citations or
backlinks to a given page. This gives some approximation of a page's
importance or quality.
PageRank extends this idea by not counting links from all pages equally,
and by
normalizing by the number of links on a page. PageRank is defined as
follows:
We assume page A has pages T1...Tn which point to it (i.e., are
citations). The parameter
d is a damping factor which can be set between 0 and 1. We usually set d
to 0.85. There
are more details about d in the next section. Also C(A) is defined as the
number of links
going out of page A. The PageRank of a page A is given as follows:
PR(A) = (1-d) + d (PR(T1)/C(T1) + ... + PR(Tn)/C(Tn))
Note that the PageRanks form a probability distribution over web pages,
so the sum of all
web pages' PageRanks will be one.
PageRank or PR(A) can be calculated using a simple iterative algorithm,
and corresponds
to the principal eigenvector of the normalized link matrix of the web.
Also, a PageRank for
26 million web pages can be computed in a few hours on a medium size
workstation.
There are many other details which are beyond the scope of this paper.

PageRank can be thought of as a model of user behavior. We assume there
is a "random
surfer" who is given a web page at random and keeps clicking on links,
never hitting "back"
but eventually gets bored and starts on another random page. The
probability that the
random surfer visits a page is its PageRank. And, the d damping factor is
the probability at
each page the "random surfer" will get bored and request another random
page. One
important variation is to only add the damping factor d to a single page,
or a group of pages.
This allows for personalization and can make it nearly impossible to
deliberately mislead the
system in order to get a higher ranking.
Another intuitive justification is that a page can have a high PageRank
if there are many
pages that point to it, or if there are some pages that point to it and
have a high PageRank.
Intuitively, pages that are well cited from many places around the web
are worth looking at.
Also, pages that have perhaps only one citation from something like the
Yahoo! homepage
are also generally worth looking at. If a page was not high quality, or
was a broken link, it is
quite likely that Yahoo's homepage would not link to it. PageRank handles
both these
cases and everything in between by recursively propagating weights
through the link
structure of the web.
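
To make the definition above concrete, here is a minimal sketch of the
"simple iterative algorithm" in Python. The four-page toy graph, the fixed
iteration count and the uniform starting values are assumptions of this
sketch, not details from the paper.

# Minimal iterative PageRank sketch:
# PR(A) = (1-d) + d * (PR(T1)/C(T1) + ... + PR(Tn)/C(Tn))
# where T1..Tn are the pages linking to A. Toy data below is hypothetical.

def pagerank(links, d=0.85, iterations=50):
    """links maps each page to the list of pages it links out to."""
    pages = list(links)
    pr = {p: 1.0 for p in pages}  # arbitrary starting values
    for _ in range(iterations):
        pr = {
            page: (1 - d) + d * sum(
                pr[t] / len(links[t])  # PR(T)/C(T)
                for t in pages if page in links[t]
            )
            for page in pages
        }
    return pr

# Toy web of four pages: every path ultimately funnels into C,
# so C should end up with the highest PageRank.
web = {"A": ["B", "C"], "B": ["C"], "C": ["A"], "D": ["C"]}
for page, rank in sorted(pagerank(web).items()):
    print(f"{page}: {rank:.3f}")

In practice the iteration runs until the values stop changing; a fixed
iteration count simply keeps the sketch short.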

Keyword optimisation is now much more heavily policed. Those who include
keywords too often will have their sites marked as "spam". However, if your
target keyword does not appear regularly enough, its ranking will not be as
high as it could be. The algorithms have become particularly smart, so as well
as the keywords you want to target you should include other relevant keywords.

Using variations of a keyword is an excellent way to ensure that your site is
viewed as relevant. Variations are slight changes to your keyword. For
example, variations of the keyword "car" include cars, vehicle, vehicles, etc.

Weight is also given to keywords that appear in certain sections of a page.
These sections include:

• Title-tag
• Meta-tags (only relevant to smaller search engines now)
• Header-tags (H1 tags)
• Image-alt-tags
• Formatting-tags (e.g. keywords in bold or italicised text)

With image-alt-tags and hyperlink title-tags it is important that you don't
simply fill these with keywords, because this will be ignored at best and
penalised at worst. Never overemphasise keywords; again, think of the end user
first. If a page seems cluttered to a human, then the search engine algorithms
will probably work that out mathematically.
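
As a rough illustration, the sketch below uses only Python's standard library
to report where a keyword appears among these weighted sections. The sample
page and the keyword "car" are made-up examples, not a definitive audit tool.

from html.parser import HTMLParser

class KeywordAudit(HTMLParser):
    """Report where a keyword appears among the weighted page sections."""
    def __init__(self, keyword):
        super().__init__()
        self.keyword = keyword.lower()
        self.current_tag = None
        self.hits = []  # (section, text) pairs

    def handle_starttag(self, tag, attrs):
        self.current_tag = tag
        attrs = dict(attrs)
        alt = attrs.get("alt") or ""
        content = attrs.get("content") or ""
        if tag == "img" and self.keyword in alt.lower():  # image-alt-tags
            self.hits.append(("image alt", alt))
        if tag == "meta" and self.keyword in content.lower():  # meta-tags
            self.hits.append(("meta", content))

    def handle_endtag(self, tag):
        self.current_tag = None

    def handle_data(self, data):
        # title-, header- and formatting-tags
        if self.current_tag in ("title", "h1", "b", "strong", "i", "em"):
            if self.keyword in data.lower():
                self.hits.append((self.current_tag, data.strip()))

page = """<html><head><title>Used Cars for Sale</title>
<meta name="description" content="Quality used cars and vehicles.">
</head><body><h1>Cars</h1>
<img src="ford.jpg" alt="Red car outside a dealership">
<p>We sell <strong>cars</strong> of every make.</p></body></html>"""

audit = KeywordAudit("car")
audit.feed(page)
for section, text in audit.hits:
    print(f"{section}: {text}")

Only the standard-library parser is used so the sketch stays self-contained; a
real audit would also need to cover hyperlink title attributes.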

Your content should be made up of your keywords and other text. A total
keyword density (all keywords combined) of around 15% to 20% is the maximum
you should aim for, and anything less than 5% is unlikely to yield good
results. Density for a single keyword should fall between 2% and 6%, though
the extremes of that range are best avoided: if possible, aim for
approximately 5% for the primary keyword and 2-3% for secondary and subsequent
keywords.
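
For illustration, here is a rough sketch of measuring keyword density as a
percentage of a page's words. The thresholds are the article's; the sample
text and the exact counting rules (simple word matching on plain text, with
each word of a multi-word phrase counted) are assumptions of this sketch.

import re

def keyword_density(text, keyword):
    """Percentage of the words in text accounted for by the keyword."""
    words = re.findall(r"[a-z']+", text.lower())
    kw = keyword.lower().split()
    n = len(kw)
    # Count occurrences of the (possibly multi-word) keyword.
    hits = sum(1 for i in range(len(words) - n + 1) if words[i:i + n] == kw)
    return 100.0 * hits * n / len(words) if words else 0.0

text = ("Property investment can be rewarding. Our property investment "
        "guides cover every kind of investment property.")
for kw in ("property investment", "investment"):
    print(f"{kw}: {keyword_density(text, kw):.1f}%")

Real pages would need their HTML stripped out before counting.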

Text formatting (e.g. strong, bold and underline): this may not carry much
weight in the algorithms, but generally if you bold the first instance of each
of your keywords and the last instance of your primary keyword, you should see
some positive results.
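
A small sketch of applying that advice to plain text, bolding the first and
last instance of a single keyword; the sample sentence is made up, and a real
page would need HTML-aware handling rather than raw string surgery.

import re

def bold_first_and_last(text, keyword):
    """Wrap the first and last occurrence of keyword in <strong> tags."""
    pattern = re.compile(re.escape(keyword), re.IGNORECASE)
    matches = list(pattern.finditer(text))
    if not matches:
        return text
    first, last = matches[0], matches[-1]
    spans = {(first.start(), first.end()), (last.start(), last.end())}
    out, prev = [], 0
    for start, end in sorted(spans):
        out.append(text[prev:start])
        out.append(f"<strong>{text[start:end]}</strong>")
        prev = end
    out.append(text[prev:])
    return "".join(out)

print(bold_first_and_last(
    "Car finance made simple. Compare car deals today.", "car"))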

The closer you can get your keywords to the
beginning of your page (top left) the
better.
Try to include your primary keyword
within the first few sentences and also
within the last
paragraph.
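
A quick sketch of checking that placement programmatically; the word-count
cutoff standing in for "the first few sentences" and the sample text are
assumptions of this sketch.

def check_placement(text, keyword, first_words=50):
    """Is the keyword near the top of the page and in the last paragraph?"""
    keyword = keyword.lower()
    paragraphs = [p for p in text.lower().split("\n\n") if p.strip()]
    opening = " ".join(" ".join(paragraphs).split()[:first_words])
    return {
        "in opening": keyword in opening,
        "in last paragraph": bool(paragraphs) and keyword in paragraphs[-1],
    }

sample = ("Property investment is easier than you might think.\n\n"
          "Our guides walk you through financing and letting.\n\n"
          "Start your property investment journey with us today.")
print(check_placement(sample, "property investment"))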

If you are targeting "Property Investment" as a key phrase, do not split the
words up if possible. Some effect is noticed if the words are split, but much
more benefit comes from including the phrase as a whole.

Include your keyword at least once in the
Alt tag of any images. Ensure that the text
is
relevant to the image and gives some
information.

A better Google PageRank (PR) will improve your search engine ranking, and
thus deliver increased traffic to your site. Incoming links from other quality
websites with PR4 and above should be your main target. If you don't have the
Google Toolbar installed, you should install it now; it will help you identify
the websites and directories that will be the most beneficial to your own.

We have highly qualified web developers and web optimisation experts to assist
you in getting the most out of your website. If you really want to do some of
the work yourself, please refer to our 10 steps to optimize your website.