Random Featured Post:

Google Ignoring Some Directories

Some directories have recently been removed from Google's cache, while others are not crawled very deeply. Additionally, some of them have not had their cache dates updated in a long time. Google may be trying to slow the growth of directories by not counting the links from many of them. Make sure you check the cache before paying for a listing.

Some of the directories will have a greater effect on relevancy in MSN or Yahoo! than they do on Google, so even if a directory is not counted by Google, the link price might still be cheap for its effects on other search relevancy algorithms.

Many directory owners are building multiple related directories. Some search algorithms such as Google Hilltop are likely to automatically filter out some of the relevancy score from a second directory if it is easily identifiable as being related to the first directory.

The one-time listing fees make directories exceptionally appealing, but expect that many directories will eventually be ignored by some of the search engines. From my perspective, that cost is factored into the listing prices. I average out the link costs across links from a number of sites. If I can spend $1,000 and get one or two dozen well-trusted links, then that is a good buy for launching a site.

Originally posted 2007-12-09 06:17:06.


Warning by Google: Cloaking Can Be Detected - 07.27.15

Matt Cutts recently blogged about cloaking. From his blog post, he made several things clear:

  1. Google definitely hates cloaking.
  2. Google does not care if a big corporation uses cloaking; they will still dislike it.
  3. Websites that use cloaking techniques will be removed from Google's index and banned.
  4. Google can and will detect all kinds of cloaking attempts sooner or later.

What is cloaking?
It is a black-hat (also written "blackhat" or "black hat") search engine optimization technique that manipulates how a website is shown to search engines versus normal web readers. "Keyword-rich" content is delivered to search engines, while the normal pages are shown to human visitors.

This is done by identifying the User-Agent and IP address of the visitor. If a spider or bot is identified, a special script shows it a different version of the page, one stuffed with keywords. The main purpose of cloaking is to gain higher rankings for certain keywords by deceiving the search engines.
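The detection step described above can be sketched in a few lines. This is purely an illustration of the technique Google penalizes, not a recommendation; the bot names and page contents are made up:

```python
# Hypothetical sketch of how a cloaking script decides which page to serve.
# The bot list and page bodies are invented for illustration; real cloakers
# also check IP ranges, which is what makes them harder to catch.

KNOWN_BOT_AGENTS = ("googlebot", "slurp", "msnbot")  # assumed examples

KEYWORD_STUFFED_PAGE = "cheap widgets cheap widgets buy cheap widgets ..."
NORMAL_PAGE = "<html><body>Welcome to our widget store.</body></html>"

def page_for(user_agent: str) -> str:
    """Return the keyword-stuffed page to crawlers, the normal page to people."""
    agent = user_agent.lower()
    if any(bot in agent for bot in KNOWN_BOT_AGENTS):
        return KEYWORD_STUFFED_PAGE
    return NORMAL_PAGE
```

Because the decision hinges on a guessable User-Agent string, a search engine can simply crawl with a browser-like User-Agent from an unlisted IP address and compare the two responses, which is exactly how cloaking gets detected.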

Every search engine (especially Google) considers cloaking deceptive, and sites caught using it will be removed from the index and banned indefinitely.

Originally posted 2007-12-09 06:17:06.


Posted in Google, Search Engine Optimization with 1 Comment →

The Meaning of PageRank (PR) - 07.20.15

Google is primarily driven by linkage data.

The Google Toolbar provides a 0-10 logarithmic scale that mirrors the link popularity of pages. PageRank gives a quick glance at how important Google thinks a page is.
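The "logarithmic scale" point can be made concrete with a small sketch: each toolbar step corresponds to a multiplicative jump in underlying link popularity, not an additive one. Google has never published the formula, so the base and cutoff below are illustrative assumptions only:

```python
import math

ASSUMED_BASE = 8  # purely illustrative; the real base is unknown

def toolbar_pr(raw_score: float) -> int:
    """Map a raw link-popularity score onto a 0-10 logarithmic toolbar scale.

    With base 8, a PR 5 page would have roughly 8x the raw score of a
    PR 4 page - which is why climbing each step gets progressively harder.
    """
    if raw_score < 1:
        return 0
    return min(10, int(math.log(raw_score, ASSUMED_BASE)))
```

Under this assumption, moving from PR 2 to PR 3 requires multiplying raw popularity by the base, not adding a fixed amount, which is consistent with how slowly high toolbar values are earned.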

Google would like you to believe that their PageRank algorithm is the core of their search technology, but they also use many other technologies to improve their search relevancy.

Many webmasters exchange links with as many people as they can, but there is an opportunity cost to everything you do. There are algorithms for sorting good links from bad links. Many link schemes increase your risk profile much quicker than they increase your potential rewards. When you link into the wrong circles, you run the risk of being associated with them.

It is important to note that this PageRank value is only one component of the Google search engine algorithm. Many times, a PR 4 site will rank above a PR 6 site because it was optimized better and has a well-defined descriptive inbound link profile, which means better, more natural links from more sites (and more related sites).

Originally posted 2008-03-03 04:35:06.


Posted in Google, Search Engine Optimization with No Comments →

Internal Duplicate Content - 07.13.15

You most likely have heard of duplicate content but are you aware of "internal duplicate content"?

This is when two or more pages on your site have almost exactly the same content, and it happens a lot more than you may realize.

Google absolutely hates internal duplicate content, and if you are using one of the more popular free content management systems, you might be a victim of it without even being aware of it.
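To make "almost the exact same content" concrete, a simple near-duplicate check can compare the text of two pages. This sketch uses Python's standard difflib with made-up page text and an arbitrary cutoff; it illustrates the idea, not how Google actually measures duplication:

```python
from difflib import SequenceMatcher

def similarity(page_a: str, page_b: str) -> float:
    """Return a 0.0-1.0 similarity ratio between two pages' text."""
    return SequenceMatcher(None, page_a, page_b).ratio()

# Typical CMS trap: the same article reachable at two URLs, with only
# boilerplate differences (e.g. a "print view" of the page).
article = "How to grow tomatoes: water daily and give them full sun."
print_view = "How to grow tomatoes: water daily and give them full sun. (print)"

DUPLICATE_THRESHOLD = 0.9  # arbitrary cutoff for this illustration
is_duplicate = similarity(article, print_view) > DUPLICATE_THRESHOLD
```

Print-friendly views, session-ID URLs, and category archives that reproduce whole posts are the usual sources of pages that score this high against each other.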

Originally posted 2007-04-29 03:46:04.


Posted in Search Engine Optimization with Comments Off on Internal Duplicate Content

Overcoming Google's Filter for New Websites - 07.06.15

Have you noticed that it is easier to rank highly on Google with older websites than it is with new websites? Have you ever wondered why that is, and what it takes to get a high ranking with a new website?

Why is it easier to get high rankings with older websites?

Spammers often use brand-new domain names to make a quick profit. They buy hundreds of domains at once and fill them with junk or scraped content, hoping to profit from the advertisements that appear on the sites (usually via AdSense). Black-hat webmasters also sometimes use new domains to test for new search engine exploits, mainly spamming techniques.

It is obviously difficult for Google to know which sites (or domains) can be trusted, which is why Google introduced a filter that down-ranks new websites until Google decides they can be trusted.

Originally posted 2007-10-17 13:12:40.


Posted in Google, Link Building, Search Engine Optimization, Traffic Generation with 7 Comments →

How To Use A Tell A Friend Script To Drive Traffic - 06.29.15

More and more webmasters face the recurring dilemma of how to increase traffic to their websites. Over the past few years, many methods have been developed to solve this predicament. While most of them work, there are those that make barely any impact.

One of the methods that has spawned many success stories in driving traffic to websites is viral marketing. Viral marketing makes use of a person's tendency to share something they find informative, entertaining, or amazing.

Many companies bank on this behavior to spread their products and increase the popularity of their company or website. Viral marketing uses many mediums to entice this behavior: an interesting story, an addictive flash game, an amusing video, or anything else that may catch a person's fancy.

This ingenious form of marketing is typically low cost and a wonderful tool for any company to utilize. The benefit greatly overshadows the cost and effort of setting up the scheme. Any website can greatly benefit from viral marketing.
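At its core, a tell-a-friend script is a form that emails a page's URL to an address the visitor supplies. A minimal sketch of the message-building step follows; the function name, field names, and wording are invented for illustration, and a real script would also validate input and rate-limit submissions so it cannot be abused as an open relay:

```python
from email.message import EmailMessage

def build_referral_email(sender: str, friend: str, page_url: str) -> EmailMessage:
    """Compose the 'tell a friend' email that carries the page link."""
    msg = EmailMessage()
    msg["From"] = sender
    msg["To"] = friend
    msg["Subject"] = "Thought you might like this page"
    msg.set_content(f"Hi,\n\nI found this and thought of you:\n{page_url}\n")
    return msg

# A real script would now hand msg to smtplib.SMTP(...).send_message(msg).
```

The traffic effect comes from the link in the body: each recipient who clicks is a visitor the site did not have to pay for, and some of them forward it again.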

Originally posted 2007-03-03 19:08:46.


Posted in Traffic Generation with 1 Comment →

Optimizing Your Page Copy - 06.22.15

Optimize Each Page

One of the most important things to understand is that each page is its own unit and has its own ranking potential and its own relevant keywords. Usually a home page has more value than the other pages since it is typically the easiest place to build links to. Home pages should generally be optimized for the most relevant competitive keyword phrases in your market that you feel you would be able to rank for. Interior pages should be optimized for other relevant phrases that relate to the content of each page.
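One concrete way to apply the "each page is its own unit" rule is to verify that no two pages target the same title. This sketch flags duplicate `<title>` values across a hypothetical site map (the URLs and titles are made up for illustration):

```python
from collections import Counter

# Hypothetical site map: URL -> the page's <title> text.
page_titles = {
    "/": "Acme Widgets - Buy Industrial Widgets Online",
    "/blue-widgets": "Blue Widgets - Specs and Pricing | Acme",
    "/red-widgets": "Red Widgets - Specs and Pricing | Acme",
    "/contact": "Acme Widgets - Buy Industrial Widgets Online",  # duplicate!
}

def duplicated_titles(titles: dict) -> list:
    """Return title strings used by more than one page."""
    counts = Counter(titles.values())
    return [title for title, n in counts.items() if n > 1]
```

A page that shares its title with the home page is competing with it for the same phrase instead of ranking for its own.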

Originally posted 2006-12-18 23:29:34.


Posted in Search Engine Optimization with 1 Comment →

When Algorithm Changes Occur - 06.15.15

Your rankings will improve. They will also get worse. Many people rush off to change things right away when the algorithms change. Sometimes the search engines roll in new algorithms aggressively, and then later roll them back. They cannot fight off new forms of spam and determine how aggressive to be with new algorithms unless they sometimes go too far with them.

If you are unsure of what just happened, then you may not want to start changing things until you figure it out. Sometimes when algorithms are rolled back or made less aggressive, many sites still do not rank well because their webmasters changed things that were helping them. Nobody is owed a good rank, and just because a ranking temporarily changes does not mean that a site has been penalized. It is far more likely that the ranking criteria shifted and the site may not match the new ranking criteria as well as it matched the old ranking criteria.

Originally posted 2007-12-28 20:00:25.


Posted in Search Engine Optimization with Comments Off on When Algorithm Changes Occur

How do I Know What Sites are Good? - 06.08.15

First off, common sense usually goes pretty far. If a page or site links to a bunch of off-topic or low-quality garbage, you can safely assume the page does not pass link authority. If you have doubts, you probably do not want to link.

Secondly, Google has a toolbar that shows how it currently views a web page or website. The Google toolbar is one of the top search engine optimization tools for a person new to search engine marketing. It works on Windows and is downloadable at http://toolbar.google.com/.

PageRank is a measure of link popularity, which can come and go. It is not hard for a successful business to rent a few high-PageRank links into its site and then leverage that link popularity for link exchanges. A site with decent PageRank can get penalized just the same as a site with low PageRank. Usually, you will want to err on the side of caution from the start.

Instead of making PageRank your primary criteria when evaluating a page or site, just think of it as a baseline.

Originally posted 2008-03-13 10:18:43.


Posted in Google, Link Building, Search Engine Optimization, Tools, Web Tools with 3 Comments →

Why is Blogging Such an Effective SEO Strategy? - 06.01.15

Over time, popular bloggers build up hundreds or thousands of subscribers. These subscribers are people interested in the fields or topics the blogger writes about. Each time those bloggers write a new post, those readers are notified of the new content in their RSS readers.

Imagine that…

  • every time you had an idea to share, 5,000 people who trust your opinions saw it.
  • most of those 5,000 also write blogs in your field or related fields.
  • some of those bloggers frequently mention your site.
  • some of those bloggers also have thousands of subscribers.

Blogs are all about spreading ideas (via in-content links) and accumulating attention. Where people go, search engines follow. If many people link to your blog posts, that also boosts your search engine rankings for the other parts of your site.

If you write a frequently updated blog, the media is more likely to believe you are a topical expert than if you are just a merchant selling goods in your industry.

Originally posted 2008-02-13 07:36:26.


Posted in Link Building, Search Engine Optimization, Traffic Generation with Comments Off on Why is Blogging Such an Effective SEO Strategy?
