The SERP (search engine results page) is the Web page that populates when you
type a keyword into a search engine and it returns a list of results.
- Select keywords you can rank for. If you're in the travel business, it may seem pretty tempting
to go for high-traffic keywords such as “hotels” or “airfare,” but the reality
is that it will probably be very difficult to rank for these keywords unless
you’re Expedia.com or the Hyatt. Choose keywords that your website can
realistically rank for and that will bring qualified traffic to your site.
- Earn quality links.
One of the best ways to increase your ranking in the SERPs is to build a
portfolio of high-quality links that point back to your site. Create great
content that others will naturally want to link to and reach out to related
websites that may be interested in linking to your site.
- Research your competitors' keywords. Are your competitors ranking in the SERPs for keywords that
you want to rank for? Are they targeting keywords that you haven’t considered
for your business? Research these keywords and think about how you might be
able to apply them to your SEO campaign.
- Prepare quality content.
As the old adage goes, content is king. Good content leads to natural links and
search engines generally like to see at least 250 words on a page. But content
isn’t just for the search engines – it’s for your customers, too! Be sure to
include content that will add value to your site from a customer perspective.
With the DomainTrust tool you can look up valuable information about your
website including its Age, PageRank, Backlinks, Indexed Pages, Alexa Rating,
Keyword Rankings, and much more.
Submit your website to the major search engines including Google, Yahoo!, Bing,
Ask, Jayde, Scrub the Web, and more.
This tool was designed to make it easy to spot-check all the major checkpoints.
The best part is that you can take the tool and put it on your own website. If
you like the tool, please link to our website.
You can also click the "Get This Tool!" link in the bottom right
corner of the widget to get the code to place on your website for free.
Check out your website rankings on Google, Yahoo!, and MSN for up to 3 keyword
phrases.
The PageRank tool helps you find out the PageRank of any website.
Check your link popularity with the major search engines, and find out where
your top competitors stack up compared to your website.
Meta tags no longer carry the same amount of importance within the search
engine algorithms, but if used strategically, they can influence the search engines
and entice searchers to click through to your site. We have been providing
webmasters with free Meta Tag generator capabilities for years to help them
implement proper meta tags. Please feel free to use our Meta Tag generator tool
below. To learn more about Meta Tags visit our search engine knowledge base.
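To illustrate the idea behind a meta tag generator, here is a minimal sketch in Python. The function name and escaping behaviour are our own assumptions for illustration, not the actual tool's code:

```python
from html import escape

def generate_meta_tags(description, keywords):
    """Build meta description and keywords tags for a page's <head>.

    `description` is a short summary of the page; `keywords` is a list
    of target keyword phrases. Values are HTML-escaped so that quotes
    in the input cannot break the attribute syntax.
    """
    return "\n".join([
        '<meta name="description" content="{}">'.format(
            escape(description, quote=True)),
        '<meta name="keywords" content="{}">'.format(
            escape(", ".join(keywords), quote=True)),
    ])

print(generate_meta_tags(
    "Adopting a dog saves a life and brings joy to your house.",
    ["adopt", "adoption", "dog", "puppy"],
))
```

The generated tags are then pasted into the page's <head> section.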
Directory Submission Service
Submit your website information to web directories and start building links.
SEO experts recommend submitting to high-PageRank directories to earn one-way
permanent links. We provide directory submission services from India; submitting
to web directories is an important way to gain PageRank, popularity, and
visibility in search engines, and moreover a good deal of traffic.
Advantages of
Directory Submission Service
- We will submit your site manually.
- No auto-submitter software is used.
- We will submit to SEO-friendly and non-reciprocal directories.
- We will manually research the relevant categories and then submit your website information.
- You will be provided a detailed report in an Excel sheet.
- Submission takes 7 to 15 days for packages of 100 to 1,000 directories.
Why Submit to
Directories?
It is simple to understand why we need to submit our site to web
directories: it gets one-way permanent links to our site and helps build
popularity with the search engines so that we appear on their SERP (Search
Engine Results Page).
Important Advantages
of Directory Submission
- We can get direct traffic from the respective web directory.
- We get a one-way permanent link from the web directory.
- Search engines respect it as one of the effective link-building methods.
Manual Directory Submission Categories
Paid Directories – Means you have to pay money for directory submission.
Niche Directories – If links come from niche-specific sites, then they are given higher value.
Niche directory submission is becoming
increasingly popular. Niche directory submission is nothing but choosing
directory submission websites from the relevant niches. For example if you are
running a travel website, you should submit to directory submission websites
that list travel related services. Niche directory submission has several benefits.
The primary benefit has already been discussed. Other benefits of niche
directory submission include the following. When you submit your website to
directories belonging to your niche, you will be able to attract more targeted
traffic. People who come to niche-specific directories arrive with a definite
need, so these are highly prospective visitors.
By making yourself visible to them, you will be able to increase the chances of
conversion.
Moreover, niche specific directory listings have more chances of getting listed
in the search results than listings in general directories because of the
keyword density and related factors. So if you want to get the maximum out of
your directory submission efforts, you must make it a point to spend a
considerable amount of time in niche directory submission. You will have to
find the right service provider to help you with your niche directory
submissions. Not all companies specialize in niche directory submissions as
this is a highly challenging task.
Reciprocal Directories
– A reciprocal link is an agreement between two webmasters to provide
a hyperlink within
their own website to each other's web site. Generally this is done to provide
readers with quick access to related sites, or to show a partnership between
two sites. Reciprocal links can also help to increase traffic to your web
site in two ways. First you will probably have some viewers visit your site
from clicking the
reciprocal link directly. Secondly, most Internet search engines
also take into account the number of web sites which contain links to your web
site; the more hyperlinks to your site found, the higher up in the search
engine rankings (depending on the search term) you'll find your site.
Download Sites List
– From Internet (sites.submit-everywhere.com)
Directories of Directories – The Directory of Directories section provides a list
of different directories sorted by category. For easy searching, all categories
are divided into four main groups: General Directories, Special Directories,
Niche Directories, and Regional Directories. The Directory of Directories assumes
paid submission of directories in accordance with the criteria of
GreatDirectories.org. We also recommend reviewing the following regularly updated
lists of quality directories: paid directories, free directories, blog
directories, and article directories.
Classified – Classified advertising is arranged according to specific categories or classifications.
The text of the advertisements is set in the same size and style of type and
the ads are usually without illustration. The three major headings are
Employment, Real Estate, and Automotive, although there are many additional
categories (e.g., Business Opportunities, Lost and Found, Pets, Personals, and
Legal Notices). Classified advertising is usually located in its own separate
section of the publication and has its own rate card.
Five Alternative Gigs Sites List
Free Directories List
Bid Directory List
Article Directory List
Social Bookmarking List
Press Release Sites List
The type codes F, P, R, D refer to: F=Free, P=Paid, R=Reciprocal, D=Deep Links
Internet marketing, social media, bulk emailing skills, newsletters; basic HTML
is mandatory. Market company and client websites; ability to update websites.
SEO meta tags, keyword analysis, daily reporting,
SEO / SEM / link building, blogs.
a)On Page and Off Page Optimization both.
b)Keyword Analysis and Web Page Optimization
c)Article and Blog Creation and Posting
d)Posting on Forums, Blogs and Social Networking Sites
e)Competitive Analysis
f)Developing traffic strategies
g)Implementing the strategies for Company Websites,
h)Documentation of traffic Strategies,
i)Manage Websites
j)Ethical SEO & Link Building
k)Excellent knowledge of SEO reporting & SEO tracking
l)SMO Executive expert in complete SMO activities will be
an added advantage
m)Managing PR/Forum Discussions, Business Listings,
Article Submission, Classified Ads, Directories, Blogs, Commenting, RSS, and
Social Media Optimization.
n)Managing Google adwords campaigns would be an add-on
o)Analytical thinking is required
p)Work on multiple projects.
q)Google Places Local Listing
3.Ranking on Local Keywords
4.Keyword Analysis
5.Directory Submission
6.Article Submission
7.Blog Submission
8.Improve Keyword Performance and Search Engine Rankings.
9.Increase Traffic and Lead Generation.
10.Manage SEO processes for different client domains.
11.Manage Google AdWords/PPC Campaigns.
12.Increase Quality Score, CTR & Conversions.
13.Working with on-page/off-page optimization, link building, directory & article
submissions, blog/forum commenting, etc.
Knowledge
of Google PPC, Analytics and SMO would be an added advantage.
Whenever you
enter a query in a search engine and hit 'enter' you get a list of web results
that contain that query term. Users normally tend to visit websites that are at
the top of this list as they perceive those to be more relevant to the query.
If you have ever wondered why some of these websites rank better than
others, then you must know that it is because of a powerful web marketing
technique called Search Engine
Optimization (SEO).
SEO is a
technique which helps search engines find and rank your site higher than the
millions of other sites in response to a search query. SEO thus
helps you get traffic from search engines.
This SEO
tutorial covers all the necessary information you need to know about Search
Engine Optimization - what is it, how does it work and differences in the
ranking criteria of major search engines.
1. How Search Engines Work
The first basic fact to understand about SEO is that search engines are not humans.
While this might be obvious to everybody, the differences between
how humans and search engines view web pages aren't. Unlike humans, search
engines are text-driven: they crawl the Web, looking at particular site items
(mainly text) to get an idea of what a site is about.
Search engines perform several activities in order to deliver
search results – crawling, indexing, processing, calculating relevancy, and retrieving.
First,
search engines crawl the Web to see what is there. This
task is performed by a piece of software called a crawler or a spider (or Googlebot, as is the case with
Google). Spiders follow links from one page to another and index everything
they find on their way. Given the number of pages on the Web (over 20
billion), it is impossible for a spider to visit a site daily just to see if a
new page has appeared or an existing page has been modified; sometimes
crawlers may not visit your site for a month or two.
What you can do is check what a crawler sees on your site. As
already mentioned, crawlers are not humans and they do not see images, Flash
movies, JavaScript, frames, password-protected pages, or directories, so if you
have tons of these on your site, you'd better run the Spider Simulator below to
see if these goodies are viewable by the spider. If they are not viewable, they
will not be spidered, not indexed, not processed, etc. – in a word, they will be
non-existent for search engines.
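To illustrate the spider's text-only view, here is a minimal sketch in Python of what such a simulator does. This is a simplified model of our own, not the actual tool: it keeps visible text, image alt text, and link targets, and discards scripts and styles.

```python
from html.parser import HTMLParser

class SpiderView(HTMLParser):
    """Approximates what a text-driven crawler 'sees': visible text
    and links, ignoring scripts, styles, and image pixels."""

    IGNORED = {"script", "style"}

    def __init__(self):
        super().__init__()
        self.text = []    # visible text fragments a spider would index
        self.links = []   # href targets a spider would follow
        self._skip = 0    # depth inside ignored tags

    def handle_starttag(self, tag, attrs):
        if tag in self.IGNORED:
            self._skip += 1
        elif tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)
        elif tag == "img":
            # A crawler can't see the image itself, only its alt text.
            for name, value in attrs:
                if name == "alt" and value:
                    self.text.append(value)

    def handle_endtag(self, tag):
        if tag in self.IGNORED and self._skip:
            self._skip -= 1

    def handle_data(self, data):
        if not self._skip and data.strip():
            self.text.append(data.strip())

view = SpiderView()
view.feed("""<html><head><title>Adopt a Dog</title>
<script>var hidden = 'invisible to spiders';</script></head>
<body><h1>Dog Adoption</h1><img src="dog.jpg" alt="puppy photo">
<a href="/breeds">small dog breeds</a></body></html>""")
print(view.text)   # the text content a spider would index
print(view.links)  # the links a spider would follow
```

Anything that never reaches `view.text` (script contents, image pixels) is, for ranking purposes, invisible.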
[Spider Simulator tool – enter a URL to spider]
After a page is crawled, the next step is to index its content. The indexed page
is stored in a giant database, from where it can later be retrieved. Essentially,
the process of indexing is identifying the words and expressions that best
describe the page and assigning the page to particular keywords.
When a search request comes, the search engine processes it – i.e. it compares
the search string in the search request with the indexed pages in the database.
Since it is likely that more than one page (practically, millions of pages)
contains the search string, the search engine starts calculating the relevancy
of each of the pages in its index to the search string.
There are various
algorithms to calculate relevancy. Each of these algorithms has
different relative weights for common factors like keyword density, links, or
metatags. That is why different search engines give different search
results pages for the same search string. What is more, it is a known fact that
all major search engines, like Yahoo!, Google, Bing, etc. periodically change
their algorithms and if you want to keep at the top, you also need to adapt
your pages to the latest changes. This is one reason (the other is your
competitors) to devote permanent efforts to SEO, if you'd like to be at the
top.
The last step
in search engines' activity is retrieving the results. Basically,
it is nothing more than simply displaying them in the browser – i.e. the
endless pages of search results that are sorted from the most relevant to the
least relevant sites.
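The indexing, processing, and retrieving steps can be illustrated with a toy inverted index in Python. This is a deliberately simplified model of our own; real engines weigh hundreds of ranking factors:

```python
from collections import defaultdict

# Toy "crawled" pages: URL -> page text.
pages = {
    "dog-adopt.net/": "adopt a dog and save a life",
    "dog-adopt.net/breeds": "small dog breeds for apartments",
    "cars.example/": "car racing news and results",
}

# Indexing: map each word to the set of pages that contain it.
index = defaultdict(set)
for url, text in pages.items():
    for word in text.lower().split():
        index[word].add(url)

def search(query):
    """Processing + relevancy: score pages by how many query words
    they contain, then retrieve them best-first."""
    scores = defaultdict(int)
    for word in query.lower().split():
        for url in index.get(word, ()):
            scores[url] += 1
    return sorted(scores, key=scores.get, reverse=True)

print(search("adopt a dog"))  # the dog pages rank above the car site
```

Swapping the naive word-count score for a different relevancy formula changes the ordering, which is exactly why different engines return different results for the same query.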
2. Differences Between the
Major Search Engines
The basic principle of operation of all search engines is the same, but the
minor differences between them lead to major changes in results relevancy.
For
different search engines different factors are important.
There are
many examples of the differences between search engines. For instance, for Yahoo!
and Bing, on-page keyword factors are of primary importance, while for Google
links are very, very important.
Also, for Google sites are like wine – the
older, the better.
II. Keywords – the Most Important Item in SEO
Keywords are the most important SEO element for every search engine; they are
what search strings are matched against.
If you fail
on this very first step, the road ahead is very bumpy and most likely you will
only waste your time and money.
There are many ways to determine which keywords
to optimize for and usually the final list of them is made after a careful
analysis of what the online population is searching for, which keywords have
your competitors chosen and above all - which are the keywords that you feel
describe your site best.
1. Choosing the Right
Keywords to Optimize For
It seems
that the time when you could easily top the results for a one-word search
string passed centuries ago. Now, when the Web is so densely populated with sites,
it is next to impossible to achieve constant top ratings for a one-word search
string. Achieving constant top ratings for two-word or three-word search
strings is a more realistic goal.
For
instance, if you have a site about dogs, do NOT try to optimize for the
keyword "dog" or "dogs". Instead, focus on keywords like "dog obedience
training", "small dog breeds", "homemade dog food", "dog food recipes", etc.
Success for very popular one- or two-word keywords is very difficult and often
not worth the trouble; it's best to focus on less competitive, highly specific
keywords.
The first
thing you need to do is come up with keywords that describe the content of your
website. Ideally, you know your users well and can correctly guess what search
strings they are likely to use to search for you. You can also try the Website
Keyword Suggestions Tool below to come up with an initial list of keywords. Run
your initial list through the Google Keyword Suggestion tool and you'll get a
related list of keywords; shortlist a couple that seem relevant and have a
decent global search volume.
[Website Keyword Suggestions tool – enter a website URL or domain]
When
choosing the keywords to optimize for, you need to consider not only the
expected monthly number of searches but also the relevancy of these keywords to
your website. Although narrow keywords get fewer searches they are a lot more
valuable than generic keywords because the users would be more interested in
your offerings. Let's say you have a section on your website where you give
advice on what to look for when adopting a dog. You might discover that the
"adopt german shepherd" keyphrase gives you better results than a
keyword like "german shepherd dogs". This page is not of interest to
current german shepherd owners but to potential german shepherd owners only. So, when
you look at the numbers of search hits per month, consider the unique hits that
fit into the theme of your site.
2.
Keyword Density
Although
there are no strict rules, try optimizing for a reasonable number of keywords –
5 or 10 is OK. If you attempt to optimize for a list of 300, you will soon see
that it is just not possible to have a good keyword density for more than a few
keywords, without making the text sound artificial and stuffed with keywords.
And what is worse, there are severe penalties (including a ban from the search
engine) for keyword stuffing, because this is considered an unethical
practice that tries to manipulate search results.
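Keyword density is simply the share of a page's words taken up by the keyword. Here is a quick sketch of the calculation in Python (our own illustration; real tools may tokenize and count differently):

```python
import re

def keyword_density(text, keyword):
    """Return the keyword's share of the total word count, in percent.

    Multi-word phrases are counted as whole-phrase occurrences; each
    occurrence contributes len(phrase) words to the tally.
    """
    words = re.findall(r"[a-z0-9']+", text.lower())
    phrase = keyword.lower().split()
    if not words or not phrase:
        return 0.0
    hits = sum(
        1 for i in range(len(words) - len(phrase) + 1)
        if words[i:i + len(phrase)] == phrase
    )
    return 100.0 * hits * len(phrase) / len(words)

text = ("Homemade dog food is cheap. Many dog owners prepare "
        "dog food recipes at home for a healthy dog.")
print(round(keyword_density(text, "dog food"), 1))  # -> 22.2
```

A density that high for one phrase already reads as stuffed, which is why stretching good density across hundreds of keywords is impossible.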
3. Keywords in Special
Places
Keywords are
very important not only in quantity but in quality as well – i.e. if you have
more keywords in the page title, the headings, and the first paragraphs, this
counts more than if you have many keywords at the bottom of the page. The
reason is that the URL (and especially the domain name), file names and
directory names, the page title, and the headings for the separate sections are
more important than ordinary text on the page; therefore, all else equal, if you
have the same keyword density as your competitors but you have keywords in the
URL, this will boost your ranking incredibly, especially with Yahoo!.
a. Keywords in URLs and File Names
The
domain name and the whole URL of a site tell a lot about it. The presumption is
that if your site is about dogs, you will have “dog”, “dogs”, or “puppy” as
part of your domain name. For instance, if your site is mainly about adopting
dogs, it is much better to name your dog site “dog-adopt.net” than
“animal-care.org”, for example, because in the first case you have two major
keywords in the URL, while in the second one you have no more than one
potential minor keyword.
When
hunting for keyword-rich domain names, don't get greedy. While from an SEO point
of view it is better to have 5 keywords in the URL, just imagine how long and
difficult to memorize the URL will be. So you need to strike a balance between
keywords in the URL and site usability, which says that more than 3 words
in the URL is way too much.
Probably you will not be able to come up with tons of good suggestions on your
own. Additionally, even if you manage to think of a couple of good domain names,
they might already be taken. In such cases, tools like the one below can come in
very handy.
[Domain name suggestion tool – enter a keyword and choose your domain
extensions: .com, .net, .org, .info, .biz, .us, .name, .in]
File names
and directory names are also important. The
advantage of keywords in file names over keywords in URLs is that they are
easier to change, if you decide to move to another niche, for example.
b. Keywords in Page Titles
The page
title is another special place because the contents of the
<title> tag usually get displayed in most search engines (including
Google). While the HTML specification does not require you to write
anything in the <title> tag (i.e. you can leave it empty and the title
bar of the browser will read "Untitled Document" or similar), for SEO purposes
you should not leave the <title> tag empty; instead, write the page title in it.
Unlike URLs,
with page titles you can get wordy. If we go on with the dog example, the
<title> tag of the home page for http://dog-adopt.net can include something like this:
<title>Adopt a Dog – Save a Life and Bring Joy to Your
Home</title>, <title>Everything You Need to Know About Adopting a
Dog</title>, or even longer.
c. Keywords in Headings
There are no
technical length limits for the contents of the <h1>, <h2>,
<h3>, ... <hn> tags, but common sense says that overly long headings
are bad for page readability. So, as with URLs, you need to be wise with the
length of headings. Another issue you need to consider is how the heading will
be displayed. If it is Heading 1 (<h1>), this generally means a larger font
size, and in this case it is recommendable to have fewer than 7-8 words in the
heading; otherwise it might spread over 2 or 3 lines, which is not good and if
you can avoid it – do it.
III. Backlinks – Another Important
SEO Item
What are
Backlinks?
In layman's terms, there are
two types of links: inbound and outbound. Outbound links start from your site
and lead to an external site, while inbound links, or backlinks, come
from an external site to yours. E.g. if cnn.com links to yourdomain.com, the
link from cnn.com is a backlink (inbound) for yourdomain.com; however, the link
is an outbound link from cnn.com's perspective.
Backlinks are among the main
building blocks of good Search Engine Optimisation (SEO).
Why
Backlinks Are Important
The number of backlinks pointing to a website is an indication of its
popularity or importance.
Backlinks are important for SEO because some search engines like Google,
give more credit to websites that have a large number of quality backlinks, and
consider those websites more relevant than others in their results pages for a
search query.
Therefore,
when search engines calculate the relevance of a site to a keyword, they not
only consider the number of backlinks to that site but also their quality. In
order to determine the quality, a search engine considers the content of the
sites. When backlinks to your site come from other sites, and those sites have
content related to your site, these backlinks are considered more relevant to your
site. If backlinks are found on sites with unrelated content, they are
considered less relevant. The higher the relevance of backlinks, the greater
their quality.
For example,
if a webmaster has a website about how to rescue orphaned dogs, and received a
backlink from another website about dogs, then that would be more relevant in a
search engine's assessment than say a link from a site about car racing.
Therefore, the higher the relevance of the site linking back to your website,
the better the quality of the backlink.
Search
engines want websites to have a level playing field, and look for natural links
built slowly over time. While it is fairly easy to modify your webpages to make
them more SEO friendly it is a lot harder for you to influence other websites
and get them to link to your website. This is the reason search engines regard
backlinks as a very important factor.
Anchor Text
When a link incorporates a keyword into the text of the hyperlink,
we call this anchor text.
A link's
anchor text may be one of the most powerful resources a webmaster has.
Backlinks from multiple websites with the anchor text "orphaned dogs"
would help your website rank higher for the keyword "orphaned dogs".
Using your keyword is a superior way to utilize a hyperlink, as against having
links with words like "click here" which do not relate to your
website. The 'Backlink Anchor Text Analysis Tool' will assist you in finding
your backlinks and the text being used to link to your website. If you find
that your site is being linked to from another website, but the anchor text is
not being utilized properly, you should request that the website change the
anchor text to something which incorporates relevant keywords. This will also
help boost your rankings.
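The idea behind such an analyzer can be sketched in a few lines of Python. This is a toy model of our own, not the actual tool: a real backlink tool gathers the external pages for you, while here the HTML is passed in directly.

```python
import re
from collections import Counter

def anchor_texts(html, target_domain):
    """Tally the anchor text of every link whose href points at
    target_domain."""
    pattern = re.compile(
        r'<a\s[^>]*href="([^"]*)"[^>]*>(.*?)</a>',
        re.IGNORECASE | re.DOTALL,
    )
    counts = Counter()
    for href, text in pattern.findall(html):
        if target_domain in href:
            # Strip nested tags and collapse whitespace in the anchor.
            counts[re.sub(r"<[^>]+>|\s+", " ", text).strip()] += 1
    return counts

html = '''
<a href="http://dog-adopt.net/">orphaned dogs</a>
<a href="http://dog-adopt.net/">click here</a>
<a href="http://other.example/">unrelated</a>
<a href="http://dog-adopt.net/breeds">orphaned dogs</a>
'''
print(anchor_texts(html, "dog-adopt.net"))
# "orphaned dogs" appears twice; "click here" is the anchor
# you would ask the linking site to improve
```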
[Backlink Anchor Text Analyzer – enter a domain name. Note: results may vary if
prefixed with www.]
Ways to Build Backlinks
Even if
plenty of backlinks come to your site the natural way, additional quality
backlinks are always welcome.
1. The Backlink Builder Tool
When you enter
the keywords of your choice, the Backlink Builder tool gives you a list of
relevant sites from which you might get some backlinks.
2. Getting Listed in Directories
If you are
serious about your Web presence, getting listed in directories like DMOZ and
Yahoo is a must, not only because this is a way to get some quality backlinks
for free, but also because this way you are easily noticed by both search
engines and potential visitors. Generally inclusion in search directories is
free but the drawback is that sometimes you have to wait a couple of months
before you get listed in the categories of your choice.
3. Forums and Article Directories
Generally
search engines index forums so posting in forums and blogs is also a way to get
quality backlinks with the anchor text you want. If the forum or blog is a
respected one, a backlink is valuable. However, in some cases the forum or blog
administrator can edit your post, or even delete it if it does not fit into the
forum or blog policy. Also, sometimes administrators do not allow links in
posts, unless they are relevant ones. In some rare cases (which are more an
exception than a rule) the owner of a forum or a blog would have banned search
engines from indexing it and in this case posting backlinks there is pointless.
4. RSS Feeds
You can
offer RSS feeds to interested sites for free. When the other site publishes
your RSS feed, you will get a backlink to your site and potentially a lot of
visitors, who will come to your site for more details about the headline and
the abstract they read on the other site.
5. Affiliate Programs
Affiliate
programs are also good for getting more visitors (and buyers) and for building
quality backlinks but they tend to be an expensive way because generally the
affiliate commission is in the range of 10 to 30 %. But if you have an
affiliate program anyway, why not use it to get some more quality backlinks?
6. News Announcements and Press Releases
Although
this is hardly an everyday way to build backlinks, it is an approach that gives
good results, if handled properly. There are many sites that publish news
announcements and press releases for free or for a small fee. A professionally
written press release about an important event can bring you many visitors and
the backlink from a respected site to yours is a good boost to your SEO
efforts. The tricky part is that you cannot release press releases if there is
nothing newsworthy. That is why we say that news announcements and press
releases are not a commodity way to build backlinks.
Link Practices That Are To
Be Avoided
There is
much discussion in these last few months about reciprocal linking. In the past
few Google updates, reciprocal links were one of the targets of the search
engine's latest filter. Many webmasters had agreed upon reciprocal link
exchanges, in order to boost their site's rankings. In a link exchange, one
webmaster places a link on his website that points to another webmaster's
website, and vice versa. Many of these links were simply not relevant, and were
just discounted. So while the irrelevant backlinks were ignored, the outbound
links still got counted, diluting the relevancy score of many websites. This
caused a great many websites to drop off the Google map.
There is
a Google patent in the works that will deal with not only the popularity of the
sites being linked to, but also how trustworthy a site is that you link to from
your own website. This will mean that you could get into trouble with the
search engine just for linking to a bad apple.
Many
webmasters have more than one website. Sometimes these websites are related,
sometimes they are not. You have to also be careful about interlinking multiple
websites on the same IP. If you own seven related websites, then a link to each
of those websites on a page could hurt you, as it may look like to a search
engine that you are trying to do something fishy. Many webmasters have tried to
manipulate backlinks in this way; and too many links to sites with the same IP
address is referred to as backlink bombing.
One thing
is certain: interlinking sites doesn't help you from a search engine
standpoint. The only reason you may want to interlink your sites in the first
place might be to provide your visitors with extra resources to visit. In this
case, it would probably be okay to provide visitors with a link to another of
your websites, but try to keep many instances of linking to the same IP address
to a bare minimum. One or two links on a page here and there probably won't
hurt you.
IV. Metatags
Meta tags
are used to summarize information about a page for search engine crawlers. This
information is not directly visible to humans visiting your website. The most
popular are the meta keywords and description tags. These meta tags are to be
inserted into the <head> area of your page.
A couple of
years ago meta tags were the primary tool for search engine optimization and
there was a direct correlation between keywords in the meta tags and your
ranking in the search results. However, algorithms have got better and today
the importance of metadata is decreasing day by day.
Meta Description
The meta
description tag is one more way for you to write a description of your
site, thus pointing search engines to the themes and topics your Web site is
relevant to. Some search engines (including Google) use the meta description to
display a summary of the listing on the search results page. So if your meta
descriptions are well written, you might be able to attract more traffic to your
website.
For
instance, for the dog adoption site, the meta Description tag could be
something like this:
<Meta Name=“Description“ Content=“Adopting a dog saves a life and brings joy
to your house. All you need to know when you consider adopting a dog.“>
Meta Keywords
A
potential use of the meta keywords tag is to include a list of keywords that
you think are relevant to your pages. The major search engines will not take
this into account but still it is a chance for you to emphasize your target
keywords. You may consider including alternative spellings (or even common
misspellings of your keywords) in the meta Keywords tag. It might be a very
small boost to your search engine rankings but why miss the chance?
eg.
<Meta name=“Keywords“ Content=“adopt, adoption, dog, dogs, puppy, canine,
save a life, homeless animals“>
Meta Robots
In this tag
you specify the pages that you do NOT want crawled and indexed. It happens that
your site has content that you need to keep there but don't want indexed.
Listing these pages in the meta robots tag is one way to exclude them from
being indexed (the other way is a robots.txt file, and generally that is the
better way to do it).
eg. <META
NAME=“ROBOTS“ CONTENT=“NOINDEX, NOFOLLOW“>
V. Content Is King
If you were
writing SEO text solely for machines, optimization would be simple. Sprinkle in
some keywords, rearrange them at random and watch the hit counter skyrocket.
Sometimes SEO copy writers forget that this isn't the case. Real people read
your text and they expect something in return for the time and attention they
give you. They expect good content, and their expectations have shaped how
search engines rank your site.
What Is Good Content?
Good SEO content has three primary characteristics:
- Offers useful information presented in an engaging format to human readers
- Boosts search engine rankings
- Attracts plenty of links from other sites
Note that
human readers come first on the list. Your site must deliver value to its
visitors and do it in an engaging way. Few sites specialize in a subject so
narrow that they have an information niche all to themselves. You'll have
competition. Set yourself apart from it with expert interviews, meaningful
lists and well-researched resources. Write well or invest in someone who does;
your investment will pay off in increased traffic.
Although
search engines aren't your primary audience, they still influence your page
rankings. In the days of early SEO, using keyword-stuffed META tags brought in
plenty of traffic. People didn't hang around on a site that promised low air
fares and delivered advertisements, but that didn't affect the search engines.
Each iteration of the engines' algorithms got better at discerning valuable
sites from clutter, though, so site creators had to sharpen their technique as
well. Instead of META tags, they used keywords sprinkled throughout an article.
In April
2011, Google's algorithm change devalued keyword and keyphrase "spam"
in favor of more nuanced means of determining a web site's value to viewers. This
update sent ripples throughout the Internet. From major commerce sites to
hobbyists' blogs, search engines boosted high-value sites and cast down some
once-mighty sites that relied too much on keyword-stuffing. Keywords haven't
lost their value, but they no longer provide the only cue to search engines.
If SEO
keywords have become devalued, links have grown in value. If other sites link
to yours as an engaging read, controversial screed or authoritative text,
search engines view your page as a site that viewers will want to see and bump
it up accordingly. Filling your site with link bait will get you noticed by
search engines and the people who use them, and the best way to draw links is
with strong, fresh content. Social media sites provide even more buzz for pages
with great content. Those links count too, so court them with content-rich
pages.
Writing SEO Content for
Search Engines -- And for People
Search engine programmers use usability research – such as Jakob Nielsen's studies of how people read web pages – to devise algorithms that provide more organic and meaningful rankings.
The same
things that catch a visitor's eye will get a search engine's attention. The
upper left corner of the page is the most valuable real estate on the page, as
it's where a reader's eyes go first. Put important text there so search engines
and people will see it immediately. It's also a good spot for boxed text and
itemized lists, both of which appeal equally to carbon-based and silicon-based
brains.
Bold text makes people and machines notice, but use those tags judiciously. Too much bold text looks like an advertisement and will cause search engines to devalue your site. Bold and italic HTML tags should surround meaningful concepts, not mere emphasis words. Bolding a "very" or italicizing a "more" means nothing to a search engine, so apply those tags to important concepts and sub-headings.
Happily, there's a way to work keywords into your content without monitoring keyword and keyphrase percentages: simply write the kind of engaging copy that people like to read. If you write for readers, the search engines will follow.
SEO Killers - Duplicate
Content, Spam and Filler
You have
a handle on what modern SEO content should be, but it's also vital to
understand what it shouldn't be. Nielsen's research described what kept readers
on web sites and shed light on what drove them away. Search engines take these
same factors into account and rank pages down or even remove them from ranking
altogether.
Duplicate
content can sink a site. Even legally obtained duplicate content such as
articles linked whole from news feeds and large blocks of attributed quotes
diminish a site's SEO value. Readers have no reason to visit a site that gives
them other sites' news verbatim. Page ranks will decline over time without
original content.
While you
don't want large blocks of duplicate content on your site, you want the timely
information that your news feeds deliver. Build fresh new content on the
foundation of other information whenever possible. It takes more effort to
assimilate and summarize a news story or to use it as a link within an original
article, but doing so will cast your site in a more positive light. If you add
sufficient value with sharp writing and relevant links, you'll find yourself in
the search engine stratosphere.
The old
method of following keyword formulas and meeting keyword percentages is not
only outdated, it will actively lower your site's rank. Heavy keyword-loading
is the hallmark of advertising web sites, and search engines know it. Using
related words and relevant phrases to enhance topic recognition marks your site
as valuable and drives its search engine value higher. Varied writing is also
more readable to your human visitors.
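If you want to sanity-check that a page has not drifted into keyword-loading territory, keyword density is trivial to measure yourself; a minimal sketch (regex word-splitting is an approximation, and the sample text is invented):

```python
import re

def keyword_density(text: str, keyword: str) -> float:
    """Percentage of words in `text` that equal `keyword` (case-insensitive)."""
    words = re.findall(r"[a-z0-9']+", text.lower())
    if not words:
        return 0.0
    hits = words.count(keyword.lower())
    return 100.0 * hits / len(words)

text = "Adopt a dog today. A dog is a loyal friend, and every dog deserves a home."
print(keyword_density(text, "dog"))  # 18.75
```

A real checker would strip markup first and count multi-word phrases as well, but the arithmetic is the same.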
Nielsen
found that human readers shunned sites full of filler phrases. Clear, concise
web writing has greater value than sprawling pages full of fluff. Hyperbole and
promotional language -- describing a product as "the best ever" or
"the perfect solution," for example -- contributes nothing to the
meaning of the text. Human readers filter out fluff and software ranks down
sites with too much of it, so eliminate it from your site.
Search
engines change their algorithms regularly in an effort to provide their users
with more relevant results. The state of SEO art changes with them. The only
constant in web writing is its human audience. Pages that provide novel,
appealing content in a reader-friendly format will rise to the top of the
rankings.
Try the Similar Page Checker to check the similarity between two URLs.
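A rough local approximation of such a similarity check, using Python's standard difflib (the sample page texts are invented; a real tool would compare the full extracted text of each URL):

```python
from difflib import SequenceMatcher

def similarity(text_a: str, text_b: str) -> float:
    """Return a 0.0-1.0 similarity ratio between two page texts."""
    return SequenceMatcher(None, text_a, text_b).ratio()

page_a = "Adopting a homeless dog saves a life."
page_b = "Adopting a homeless dog saves a life."
page_c = "Buy cheap airfare and hotel deals online today."

print(similarity(page_a, page_b))        # 1.0 (duplicate content)
print(similarity(page_a, page_c) < 0.6)  # True (clearly distinct content)
```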
VI. Visual Extras and SEO
As already mentioned, search engines have no means to directly index extras like images, sounds, Flash movies, and JavaScript. Instead, they rely on you to provide a meaningful textual description, and based on it they can index these files. In a sense, the situation is similar to that of text 10 or so years ago – you provided a description in the metatag and search engines used this description to index and process your page. If technology advances further, one day it might be possible for search engines to index images, movies, etc., but for the time being this is just a dream.
1. Images
Images are an essential part of any Web page, and from a designer's point of view they are not an extra but a mandatory item for every site. However, designers and search engines are at opposite poles here, because for search engines every piece of information that is buried in an image is lost. When working with designers, it sometimes takes a while to explain that having textual links (with proper anchor text) instead of shiny images is not a whim and that clear text navigation really is mandatory. Yes, it can be hard to find the right balance between artistic expression and SEO-friendliness, but since even the finest site is lost in cyberspace if it cannot be found by search engines, some compromise in visual appearance cannot be avoided.
With all that said, the idea is not to skip images altogether. Nowadays that is impossible because the result would be an ugly site. Rather, the idea is that images should be used for illustration and decoration, not for navigation or, even worse, for displaying text (in a fancy font, for example). Most important of all: in the alt attribute of the <img> tag, always provide a meaningful textual description of the image. The HTML specification does not require this, but search engines do. Also, it does not hurt to give meaningful names to the image files themselves rather than naming them image1.jpg, image2.jpg, imageN.jpg. For instance, in the next example the image file has an informative name and the alt attribute provides enough additional information: <img src="one_month_Jim.jpg" alt="A picture of Jim when he was a one-month-old puppy">. But don't go to extremes, like writing 20-word alt attributes for 1-pixel images, because this looks suspicious and starts to smell like keyword stuffing.
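Auditing a site for images that lack alt text is easy to automate; a minimal sketch with Python's built-in HTML parser (the sample markup is invented):

```python
from html.parser import HTMLParser

class AltAuditor(HTMLParser):
    """Collect src values of <img> tags that have no (or empty) alt text."""
    def __init__(self):
        super().__init__()
        self.missing_alt = []

    def handle_starttag(self, tag, attrs):
        if tag == "img":
            attrs = dict(attrs)
            if not attrs.get("alt"):
                self.missing_alt.append(attrs.get("src", "?"))

html = """
<img src="one_month_Jim.jpg" alt="Jim as a one-month-old puppy">
<img src="image2.jpg">
"""
auditor = AltAuditor()
auditor.feed(html)
print(auditor.missing_alt)  # ['image2.jpg']
```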
2. Animation and Movies
The situation with animation and movies is similar to that with images – they are valuable from a designer's point of view but are not loved by search engines. For instance, it is still pretty common to have an impressive Flash introduction on the home page. You just cannot imagine what a disadvantage with search engines this is – it is a number one rankings killer! And it gets even worse if you use Flash to tell a story that could be written in plain text and hence crawled and indexed by search engines. One workaround is to provide search engines with an HTML version of the Flash movie, but in that case make sure that you have excluded the original Flash movie from indexing (this is done in the robots.txt file, but that file is not a beginner's topic, which is why it is excluded from this tutorial); otherwise you can be penalized for duplicate content.
There are rumors that Google is building new search technology that will allow searching inside animation and movies, and that the .swf format will contain new metadata that can be used by search engines. Until then, you'd better either refrain from using (too much) Flash, or at least provide a textual description of the movie alongside it.
3. Frames
It is good news that frames are slowly but surely disappearing from the Web. Five or ten years ago they were an absolute hit with designers, but never with search engines. Search engines have difficulty indexing framed pages because the URL of the page stays the same no matter which of the separate frames is open. For search engines this was a shock, because there were actually 3 or 4 pages but only one URL, while for search engines 1 URL is 1 page. Of course, search engines can follow the links to the pages in the frameset and index them, but this is a hurdle for them.
If you
still insist on using frames, make sure that you provide a meaningful
description of the site in the <noframes> tag. The following example is
not for beginners but even if you do not understand everything in it, just
remember that the <noframes> tag is the place to provide an alternative
version (or at least a short description) of your site for search engines and
users whose browsers do not support frames. If you decide to use the
<noframes> tag, maybe you'd like to read more about it before you start
using it.
Example:
<noframes> <p> This site is best viewed in a browser that supports
frames. </p><p> Welcome to our site for prospective dog adopters!
Adopting a homeless dog is a most noble deed that will help save the life of
the poor creature. </p></noframes>
4.
JavaScript
This is another hot potato. Everybody knows that pure HTML is powerless to build complex sites with a lot of functionality (HTML was never intended to be a programming language for building Web applications, so nobody expects it to handle writing to a database or even storing session information) as required by today's Web users, and that is why other languages (like JavaScript or PHP) come to enhance HTML. For now, search engines mostly ignore the JavaScript they encounter on a page. As a result, first, if you have links inside JavaScript code, chances are they will not be spidered. Second, if JavaScript sits in the HTML file itself (rather than in an external .js file that is invoked when necessary), it clutters the HTML file, and spiders might just skip it and move on to the next site. For your information, there is a <noscript> tag that lets you provide an alternative to running the script in the browser, but because most of its applications are pretty complicated, it is hardly suitable to explain here.
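The point about links buried in JavaScript can be illustrated: a simple spider that parses only HTML anchors never sees a link that a script writes into the page. A toy sketch (the markup is invented):

```python
from html.parser import HTMLParser

class LinkExtractor(HTMLParser):
    """Collect href values from <a> tags, the way a simple spider would."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            href = dict(attrs).get("href")
            if href:
                self.links.append(href)

page = """
<a href="/adopt.html">Adopt a dog</a>
<script>
  // This anchor only exists after the script runs; an HTML parser treats
  // everything inside <script> as opaque text and never sees it as a tag.
  document.write('<a href="/hidden.html">Hidden</a>');
</script>
"""
extractor = LinkExtractor()
extractor.feed(page)
print(extractor.links)  # ['/adopt.html']
```

The hidden link never reaches `handle_starttag`, which is exactly why script-only navigation goes unspidered.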
VII. Static Versus Dynamic URLs
Based on the previous section, you might have gotten the impression that search engine algorithms thwart every designer's effort to make a site gorgeous. Well, it has been explained why search engines do not like images, movies, applets and other extras. Now you might think that search engines are far too fussy for disliking dynamic URLs as well. Honestly, users are not in love with URLs like http://domain.com/product.php?cid=1&pid=5 either, because such URLs do not say much about the contents of the page.
There are a
couple of good reasons why static URLs score better than dynamic URLs.
First, dynamic URLs
are not always there – i.e. the page is generated on request after the user
performs some kind of action (fills a form and submits it or performs a search
using the site's search engine). In a sense, such pages are nonexistent for
search engines, because they index the Web by crawling it, not by filling in
forms.
Second, even if
a dynamic page has already been generated by a previous user request and is
stored on the server, search engines might just skip it if it has too many
question marks and other special symbols in it. Once upon a time search engines
did not index dynamic pages at all, while today they do index them but
generally slower than they index static pages.
The idea is not to revert to static HTML only. Database-driven sites are great, but it will be much better if you serve your pages to search engines and users in a format they can easily handle. One solution to the dynamic URLs problem is called URL rewriting. There are special tools (different for different platforms and servers) that rewrite URLs in a friendlier format, so they appear in the browser like normal HTML pages. Try the URL Rewriting Tool below; it will convert the cryptic text from the previous example into something more readable, like http://mydomain.com/product-categoryid-1-productid-5.
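The rewriting itself is usually done by the web server (for example, via Apache's mod_rewrite), but the mapping is easy to sketch. A toy Python version that turns the dynamic URL from the example above into the friendlier form (the cid/pid parameter names and the output pattern are assumptions taken from that example):

```python
from urllib.parse import urlsplit, parse_qsl

def rewrite(url: str) -> str:
    """Turn /product.php?cid=1&pid=5 into /product-categoryid-1-productid-5."""
    names = {"cid": "categoryid", "pid": "productid"}  # assumed parameter mapping
    parts = urlsplit(url)
    path = parts.path.rsplit(".", 1)[0]  # drop the .php extension
    for key, value in parse_qsl(parts.query):
        path += "-%s-%s" % (names.get(key, key), value)
    return "%s://%s%s" % (parts.scheme, parts.netloc, path)

print(rewrite("http://domain.com/product.php?cid=1&pid=5"))
# http://domain.com/product-categoryid-1-productid-5
```

On a real server the mapping runs in the opposite direction: the friendly URL arrives and the server translates it back into the script-plus-query form internally.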
VIII. Promoting Your Site to Increase Traffic
The main purpose of SEO is to make your site visible to search engines, thus leading to higher rankings in search results pages, which in turn brings more traffic to your site. And having more visitors (and above all buyers) is ultimately the goal of site promotion.
For truth's sake, SEO is only one way to promote your site and increase traffic – there are many other online and offline ways to accomplish the goal of getting high traffic and reaching your target audience.
1. Submitting Your Site to Search Directories, Forums and Special Sites
After you have finished optimizing your new site, the time comes to submit it to search engines. Generally, with search engines you don't have to do anything special to get your site included in their indices – they will come and find you. It cannot be said exactly when they will visit your site for the first time, or at what intervals they will revisit it, but there is hardly anything you can do to invite them. Sure, you can go to their Submit a Site pages and submit the URL of your new site, but do not expect them to hop over right away. What is more, even if you submit your URL, most search engines reserve the right to judge whether to crawl your site or not.
Anyway, here are the URLs of the submission pages of the three major search engines: Google, MSN, and Yahoo!.
In addition to search engines, you may also want to have your site included in search directories. Although search directories also list sites that are relevant to a given topic, they differ from search engines in several respects. First, search directories are usually maintained by humans, and the sites in them are reviewed for relevancy after they have been submitted. Second, search directories do not use crawlers to get URLs, so you need to go to them and submit your site; but once you do this, you can stay there forever and no further effort on your side is necessary. Some of the most popular search directories are DMOZ and Yahoo! (the directory, not the search engine itself), and here are the URLs of their submission pages: DMOZ and Yahoo!.
Sometimes posting a link to your site in the right forums or special sites can work miracles in terms of traffic. You need to find the forums and sites that are leaders in the fields of interest to you, but generally even a simple search in Google or the other major search engines will retrieve their names. For instance, if you are a hardware freak, type "hardware forums" in the search box and in a second you will have a list of sites that are favorites of other hardware freaks. Then you need to check the sites one by one, because some of them might not allow posting links to commercial sites. Posting in forums is more time-consuming than submitting to search engines, but it can also be pretty rewarding.
2. Specialized Search
Engines
Google, Yahoo!, and MSN are not the only search engines on Earth, nor even the only general-purpose ones. There are many other general-purpose and specialized search engines, and some of them can be really helpful for reaching your target audience. You just can't imagine how many niches have specialized search engines – from law, to radio stations, to education! Some of them are actually huge sites that gather Web-wide resources on a particular topic, but almost all of them have sections for submitting links to external sites of interest. So, after you find the specialized search engines in your niche, go to their sites and submit your URL – this could prove more traffic-worthy than striving to get to the top of Google.
3. Paid Ads and Submissions
We have already mentioned some alternatives to search engines – forums, specialized sites and search engines, search directories – but if you need to make sure that your site gets noticed, you can always resort to paid ads and submissions. Yes, paid listings are a fast and guaranteed way to appear in search results, and most of the major search engines accept payment to put your URL in the Paid Links section for keywords of interest to you. But keep in mind that users generally do not trust paid links as much as they do organic ones – in a sense it looks as if you are bribing the search engine to place you where you can't get on your own, so think twice about the pros and cons of paying to get listed.
SEO Tools
Backlink Tracker Pro: Free tool to check your paid / exchanged links. Get alerts when your backlinks have been removed or converted to nofollow links.
Search engines are known to penalize websites that contain duplicate / similar content. Your content could be similar to other websites on the Internet, or pages within your own website could be similar to each other. This tool allows you to determine the percentage of similarity between two pages.
This tool simulates a search engine crawler by displaying the contents of a webpage exactly as a search engine would see it. It also displays the links that a search engine would follow (crawl) when it visits the webpage.
This tool helps you build a LOT of quality backlinks. It searches for websites of the theme you specify that contain keyphrases like "Add link", "Add site", "Add URL", "Submit URL" etc.; most of the results could be potential backlinks. Text links are important for ranking well in search engines.
Backlink Summary: This tool will give you a summary of your competitors' backlinks.
It is important that a search engine is able to follow any redirects that you may have set up. This tool helps you determine whether the redirect you have created is search engine friendly.
This tool helps you ensure that your link partners are linking
back to your website. It also determines the anchor text used by your link
partners to link to your website.
Older domains may get a slight edge in search engine rankings.
This tool displays the approximate age of a website on the Internet and allows
you to view how the website looked when it first started. It also helps you
find out the age of your competitors' domains.
Having a KEYWORD-RICH domain name is an important factor for
Search Engine Optimization. This tool will suggest keyword rich domain names.
List of Best and Worst Practices for Designing a High-Traffic Website
Thank you slashdot :-)
Here is a checklist of the factors that affect your rankings with
Google, Bing, Yahoo! and the other search engines. The list contains positive,
negative and neutral factors because all of them exist. Most of the factors in
the checklist apply mainly to Google and partially to Bing, Yahoo! and all the
other search engines of lesser importance. If you need more information on
particular sections of the checklist, you may want to read our SEO tutorial,
which gives more detailed explanations of Keywords, Links, Metatags, Visual
Extras, etc.
Keywords
1
Keywords
in <title> tag
This
is one of the most important places to have a keyword because what is written
inside the <title> tag shows in search results as your page title. The
title tag must be short (6 or 7 words at most) and the keyword must be
near the beginning.
+3
2
Keywords
in URL
Keywords
in URLs help a lot - e.g. http://domainname.com/seo-services.html,
where “SEO services” is the keyword phrase you attempt to rank well for. But
if you don't have the keywords in other parts of the document, don't rely on
having them in the URL.
+3
3
Keyword
density in document text
Another very important factor you need to check. 3-7% for major keywords is best, 1-2% for minor. Keyword density of over 10% is suspicious and looks more like keyword stuffing than naturally written text.
+3
4
Keywords in anchor text
Also
very important, especially for the anchor text of inbound links, because
if you have the keyword in the anchor text in a link from another site, this
is regarded as getting a vote from this site not only about your site in
general, but about the keyword in particular.
+3
5
Keywords
in headings (<H1>, <H2>, etc. tags)
One
more place where keywords count a lot. But beware that your page has actual
text about the particular keyword.
+3
6
Keywords
in the beginning of a document
Also
counts, though not as much as anchor text, title tag or headings. However,
have in mind that the beginning of a document does not necessarily mean the
first paragraph – for instance if you use tables, the first paragraph of text
might be in the second half of the table.
+2
7
Keywords
in <alt> tags
Spiders
don't read images but they do read their textual descriptions in the
<alt> tag, so if you have images on your page, fill in the <alt>
tag with some keywords about them.
+2
8
Keywords
in metatags
Less
and less important, especially for Google. Yahoo! and Bing still rely on
them, so if you are optimizing for Yahoo! or Bing, fill these tags properly.
In any case, filling these tags properly will not hurt, so do it.
+1
9
Keyword
proximity
Keyword
proximity measures how close in the text the keywords are. It is best if they
are immediately one after the other (e.g. “dog food”), with no other words
between them. For instance, if you have “dog” in the first paragraph and
“food” in the third paragraph, this also counts but not as much as having the
phrase “dog food” without any other words in between. Keyword proximity is
applicable for keyword phrases that consist of 2 or more words.
+1
10
Keyword
phrases
In
addition to keywords, you can optimize for keyword phrases that consist of
several words – e.g. “SEO services”. It is best when the keyword phrases you
optimize for are popular ones, so you can get a lot of exact matches of the
search string but sometimes it makes sense to optimize for 2 or 3 separate
keywords (“SEO” and “services”) than for one phrase that might occasionally
get an exact match.
+1
11
Secondary
keywords
Optimizing
for secondary keywords can be a gold mine because when everybody else is
optimizing for the most popular keywords, there will be less competition (and
probably more hits) for pages that are optimized for the minor words. For
instance, “real estate new jersey” might have thousand times less hits than
“real estate” only but if you are operating in New Jersey, you will get less
but considerably better targeted traffic.
+1
12
Keyword
stemming
For
English this is not so much of a factor because words that stem from the same
root (e.g. dog, dogs, doggy, etc.) are considered related and if you have
“dog” on your page, you will get hits for “dogs” and “doggy” as well, but for
other languages keyword stemming could be an issue because different words
that stem from the same root are considered as not related and you might need
to optimize for all of them.
+1
13
Synonyms
Optimizing
for synonyms of the target keywords, in addition to the main keywords. This
is good for sites in English, for which search engines are smart enough to
use synonyms as well, when ranking sites but for many other languages
synonyms are not taken into account, when calculating rankings and relevancy.
+1
14
Keyword
Mistypes
Spelling
errors are very frequent and if you know that your target keywords have
popular misspellings or alternative spellings (i.e. Christmas and Xmas), you
might be tempted to optimize for them. Yes, this might get you some more
traffic but having spelling mistakes on your site does not make a good
impression, so you'd better not do it, or do it only in the metatags.
0
15
Keyword
dilution
When
you are optimizing for an excessive amount of keywords, especially unrelated
ones, this will affect the performance of all your keywords and even the
major ones will be lost (diluted) in the text.
-2
16
Keyword
stuffing
Any
artificially inflated keyword density (10% and over) is keyword stuffing and
you risk getting banned from search engines.
-3
Links - internal, inbound, outbound
17
Anchor
text of inbound links
As
discussed in the Keywords section, this is one of the most important factors
for good rankings. It is best if you have a keyword in the anchor text but
even if you don't, it is still OK. However, don't use the same anchor text
all the time because this is also penalized by Google. Try to use synonyms,
keyword stemming, or simply the name of your site instead.
+3
18
Origin
of inbound links
Besides
the anchor text, it is important if the site that links to you is a reputable
one or not. Generally sites with greater Google PR are considered reputable.
Links from poor sites and link farms can do real harm to you, so avoid them
at all costs.
+3
19
Links
from similar sites
Links from sites with a theme similar to yours are generally more valuable than links from unrelated sites, because they confirm the relevance of your content. Also important are their anchor text (and its diversity), the lack/presence of keyword(s) in it, the link age, etc.
+3
20
Links
from .edu and .gov sites
These
links are precious because .edu and .gov sites are more reputable than .com, .biz, .info, etc. domains. Additionally, such links are hard to obtain.
+3
21
Number
of backlinks
Generally
the more, the better. But the reputation of the sites that link to you is
more important than their number. Also important is their anchor text, is
there a keyword in it, how old are they, etc.
+3
22
Anchor
text of internal links
This
also matters, though not as much as the anchor text of inbound links.
+2
23
Around-the-anchor
text
The
text that is immediately before and after the anchor text also matters because
it further indicates the relevance of the link – i.e. if the link is
artificial or it naturally flows in the text.
+2
24
Age
of inbound links
The
older, the better. Getting many new links in a short time suggests buying
them.
+2
25
Links
from directories
Could
work, though it strongly depends on which directories. Being listed in DMOZ,
Yahoo Directory and similar directories is a great boost for your ranking but
having tons of links from PR0 directories is useless or even harmful because
it can even be regarded as link spamming, if you have hundreds or thousands
of such links.
+2
26
Number
of outgoing links on the page that links to you
The
fewer, the better for you because this way your link looks more important.
+1
27
Named
anchors
Named anchors (the target places of internal links) are useful for internal navigation but also for SEO, because they additionally stress that a particular page, paragraph or text is important. In the code, a link to a named anchor looks like this: <a href="#dogs">Read about dogs</a>, where "dogs" is the name of an anchor defined elsewhere on the page (e.g. <a name="dogs">).
+1
28
IP
address of inbound link
Google denies that they discriminate against links that
come from the same IP address or C class of addresses, so for Google the IP
address can be considered neutral to the weight of inbound links. However,
Bing and Yahoo! may discard links from the same IPs or IP classes, so it is
always better to get links from different IPs.
+1
29
Inbound
links from link farms and other suspicious sites
Presumably,
this does not affect you, provided the links are not reciprocal. The idea is
that it is beyond your control to define what a link farm links to, so you
don't get penalized when such sites link to you because this is not your
fault. However, some recent changes to the Google algorithm suggest the opposite. This is why you must always stay away from link farms and other suspicious sites, or, if you see that they link to you, contact their webmaster and ask for the link to be removed.
0
30
Many
outgoing links
Google does not like pages that consist mainly of links, so you'd better keep them
under 100 per page. Having many outgoing links does not get you any benefits
in terms of ranking and could even make your situation worse.
-1
31
Excessive
linking, link spamming
It
is bad for your rankings, when you have many links to/from the same sites
(even if it is not a cross-linking scheme or links to bad neighbors) because
it suggests link buying or at least spamming. In the best case only some of
the links are taken into account for SEO rankings.
-1
32
Outbound
links to link farms and other suspicious sites
Unlike inbound links from link farms and other suspicious sites, outbound links to bad neighbors can drown you. You need to periodically check the status of the sites you link to, because sometimes good sites become bad neighbors and vice versa.
-3
33
Cross-linking
Cross-linking
occurs when site A links to site B, site B links to site C and site C links
back to site A. This is the simplest example but more complex schemes are
possible. Cross-linking looks like disguised reciprocal link trading and is
penalized.
-3
34
Single
pixel links
When you have a link that is a pixel or so wide, it is invisible to humans, so nobody will click on it, and it is obvious that this link is an attempt to manipulate search engines.
-3
Metatags
35
<Description>
metatag
Metatags
are becoming less and less important but if there are metatags that still
matter, these are the <description> and <keywords> ones. Use the
<Description> metatag to write the description of your site. Besides
the fact that metatags still rock on Bing and Yahoo!, the <Description>
metatag has one more advantage – it sometimes pops in the description of your
site in search results.
+1
36
<Keywords>
metatag
The
<Keywords> metatag also matters, though as all metatags it gets almost
no attention from Google and some attention from Bing and Yahoo! Keep the
metatag reasonably long – 10 to 20 keywords at most. Don't stuff the
<Keywords> tag with keywords that you don't have on the page, this is
bad for your rankings.
+1
37
<Language>
metatag
If
your site is language-specific, don't leave this tag empty. Search engines
have more sophisticated ways of determining the language of a page than
relying on the <language> metatag, but they still consider it.
+1
38
<Refresh>
metatag
The
<Refresh> metatag is one way to redirect visitors from your site to
another. Only do it if you have recently migrated your site to a new domain
and you need to temporarily redirect visitors. When used for a long time, the
<refresh> metatag is regarded as an unethical practice and this can hurt your rankings. In any case, redirecting through a 301 is much better.
-1
Content
39
Unique
content
Having
more content (relevant content, which is different from the content on other
sites both in wording and topics) is a real boost for your site's rankings.
+3
40
Frequency
of content change
Frequent
changes are favored. It is great when you constantly add new content but it
is not so great when you only make small updates to existing content.
+3
41
Keywords
font size
When
a keyword in the document text is in a larger font size than the rest of the
on-page text, it is more noticeable and therefore treated as more important
than the rest of the text. The same applies to headings
(<h1>, <h2>, etc.), which generally are in a larger font size than
the rest of the text.
+2
42
Keywords
formatting
Bold
and italic are another way to emphasize important words and phrases. However,
use bold, italic and larger font sizes within reason because otherwise you
might achieve just the opposite effect.
+2
43
Age
of document
Recent
documents (or at least regularly updated ones) are favored.
+2
44
File
size
Generally,
long pages (i.e. 1,500-2,000 words or more) are not favored; you can often
achieve better rankings with three short pages (500-1,000 words each) than
with one long page on a given topic, so split long pages into multiple
smaller ones. On the other hand, pages with 100-200 words of text or less
are also disliked by Google.
+1
45
Content
separation
From
a marketing point of view, content separation (based on IP, browser type,
etc.) might be great, but for SEO it is bad because when you have one URL
with differing content, search engines get confused about what the actual
content of the page is.
-2
46
Poor
coding and design
Search
engines say that they do not want poorly designed and coded sites. In
practice, hardly any sites are banned because of messy code or ugly images,
but when the design and/or coding of a site is poor, the site might not be
indexable at all, so in this sense poor code and design can harm you a lot.
-2
47
Illegal
Content
Using
other people's copyrighted content without their permission or using content
that promotes legal violations can get you kicked out of search engines.
-3
48
Invisible
text
This
is a black hat SEO practice, and when spiders discover that you have text
specifically for them but not for humans, don't be surprised by the penalty.
-3
49
Cloaking
Cloaking
is another illegal technique, which partially involves content separation
because spiders see one page (highly-optimized, of course), and everybody
else is presented with another version of the same page.
-3
50
Doorway
pages
Creating
pages that aim to trick spiders into believing that your site is highly
relevant when it is not is another way to get kicked out of search engines.
-3
51
Duplicate
content
When
you have the same content on several pages of the site, this will not make
your site look larger because the duplicate content penalty kicks in. To a
lesser degree, duplicate content applies to pages that reside on other sites,
but obviously these cases are not always banned – i.e. article directories
and mirror sites do exist and prosper.
-3
Visual Extras and SEO
52
JavaScript
If
used wisely, it will not hurt. But if your main content is displayed through
JavaScript, this makes it more difficult for spiders to follow, and if the
JavaScript code is a mess and spiders can't follow it, this will definitely
hurt your rankings.
0
53
Images
in text
Having
a text-only site is boring, but having many images and no text is an SEO
sin. Always provide a meaningful description of an image in its alt
attribute, but don't stuff it with keywords or irrelevant information.
0
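The image description above looks like this in practice (the file name and text are illustrative):

```html
<!-- Meaningful, concise description; no keyword stuffing -->
<img src="hotel-lobby.jpg" alt="Lobby of a boutique hotel in Paris">
```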
54
Podcasts
and videos
Podcasts
and videos are becoming more and more popular, but as with all non-textual
goodies, search engines can't read them, so if you don't provide a transcript
of the podcast or video, it is as if the podcast or video were not there,
because it will not be indexed by search engines.
0
55
Images
instead of text links
Using
images instead of text links is bad, especially when you don't fill in the
alt attribute. But even a filled-in alt attribute is not the
same as a bold, underlined, 16-pt. text link, so use images for navigation
only if this is really vital for the graphic layout of your site.
-1
56
Frames
Frames
are very, very bad for SEO. Avoid using them unless really necessary.
-2
57
Flash
Spiders
don't index the content of Flash movies, so if you use Flash on your site,
don't forget to give it an alternative textual description.
-2
58
A
Flash home page
Fortunately
this epidemic seems to have come to an end. Having a Flash home page
(and sometimes whole sections of your site) with no HTML version is SEO
suicide.
-3
Domains, URLs, Web Mastery
59
Keyword-rich URLs and filenames
A
very important factor, especially for Yahoo! and Bing.
+3
60
Site
Accessibility
Another
fundamental issue that is often neglected. If the site (or separate
pages) is inaccessible because of broken links, 404 errors,
password-protected areas, or other similar reasons, then the site simply
can't be indexed.
+3
61
Sitemap
It
is great to have a complete and up-to-date sitemap; spiders love it, no
matter if it is a plain old HTML sitemap or the special Google sitemap
format.
+2
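The "special Google sitemap format" mentioned above is now the standard sitemaps.org XML format. A minimal sitemap sketch (URL and date are illustrative):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.example.com/</loc>
    <lastmod>2024-01-15</lastmod>
  </url>
</urlset>
```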
62
Site
size
Spiders
love large sites, so generally the bigger, the better. However, big
sites become user-unfriendly and difficult to navigate, so sometimes it makes
sense to separate a big site into a couple of smaller ones. On the other
hand, hardly any sites are penalized for having 10,000+ pages, so don't
split your site into pieces only because it is getting larger and larger.
+2
63
Site
age
Similarly
to wine, older sites are respected more. The
idea is that an old, established site is more trustworthy (it has been
around and is here to stay) than a new site that has just popped up and
might soon disappear.
+2
64
Site theme
It is not only keywords in
URLs and on the page that matter. The site theme is even more important for
good ranking, because when the site fits into one theme, this boosts the
rankings of all its pages that are related to this theme.
+2
65
File Location on Site
File location is important
and files that are located in the root directory or near it tend to rank
better than files that are buried 5 or more levels below.
+1
66
Domains versus subdomains,
separate domains
Having a separate domain is
better – i.e. instead of having blablabla.blogspot.com, register a separate
blablabla.com domain.
+1
67
Top-level domains (TLDs)
Not all TLDs are equal. There
are TLDs that are better than others. For instance, the most popular TLD –
.com – is much better than .ws, .biz, or .info domains, but (all else being
equal) nothing beats an old .edu or .org domain.
+1
68
Hyphens in URLs
Hyphens between the words in
a URL increase readability and help with SEO rankings. This applies both to
hyphens in domain names and in the rest of the URL.
+1
69
URL length
Generally doesn't matter, but
very long URLs start to look spammy, so avoid having more
than 10 words in the URL (3 or 4 for the domain name itself and 6 or 7 for
the rest of the address is acceptable).
0
70
IP address
Could matter only for shared
hosting, or when a site is hosted with a free hosting provider whose IP or
whole C-class of IP addresses is blacklisted due to spamming or other
illegal practices.
0
71
Adsense will boost your
ranking
Adsense is not related in any
way to SEO ranking. Google will definitely not give you a ranking bonus
because of hosting Adsense ads. Adsense might boost your income but this has
nothing to do with your search rankings.
0
72
Adwords will boost your
ranking
Similarly to Adsense, Adwords
has nothing to do with your search rankings. Adwords will bring more traffic
to your site, but this will not affect your rankings in any way.
0
73
Hosting downtime
Hosting
downtime is directly related to accessibility, because if a site is
frequently down, it can't be indexed. But in practice this is a factor only
if your hosting provider is really unreliable and has less than 97-98%
uptime.
-1
74
Dynamic URLs
Spiders prefer static URLs,
though you will see many dynamic pages in top positions. Long dynamic URLs
(over 100 characters) are really bad, and in any case you'd better use a
tool to rewrite dynamic URLs into something more human- and SEO-friendly.
-1
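One common way to rewrite dynamic URLs is Apache's mod_rewrite. A sketch, assuming an Apache server and an illustrative URL scheme:

```apache
RewriteEngine On
# Serve the friendly URL /product/123 from the dynamic script
# /index.php?page=product&id=123 without exposing the query string
RewriteRule ^product/([0-9]+)$ /index.php?page=product&id=$1 [L,QSA]
```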
75
Session IDs
This is even worse than
dynamic URLs. Don't use session IDs for information that you'd like to be
indexed by spiders.
-2
76
Bans in robots.txt
If indexing of a considerable
portion of the site is banned, this is likely to affect the non-banned part
as well, because spiders will visit a “noindex” site less frequently.
-2
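For reference, a minimal robots.txt that bans one section while leaving the rest of the site indexable (the paths are illustrative):

```
User-agent: *
Disallow: /admin/
Sitemap: https://www.example.com/sitemap.xml
```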
77
Redirects
(301 and 302)
When
not applied properly, redirects can hurt a lot – the target page might
not open, or worse, a redirect can be regarded as a black hat technique
when the visitor is immediately taken to a different page.
Keyword
Research
It all begins with words typed into a search box.
Keyword
research is one of the most important, valuable, and high return activities in
the search marketing field. Ranking for the "right" keywords can make
or break your website. Through the detective work of puzzling out your market's
keyword demand, you not only learn which terms and phrases to target with SEO,
but also learn more about your customers as a whole.
It's
not always about getting visitors to your site, but about getting the right
kind of visitors. The usefulness of this intelligence cannot be
overstated - with keyword research you can predict shifts in demand, respond to
changing market conditions, and produce the products, services, and content
that web searchers are already actively seeking. In the history of marketing,
there has never been such a low barrier to entry in understanding the
motivations of consumers in virtually every niche.