So you've done some sort of test where you took two new domains, on-page optimized them both nearly identically for the keyword you're trying to rank for, and ...
1. Placed 1 .edu link to one domain ...
2. Placed 100 non-.edu links to the other
Let it marinate, and your result was that the domain with the single .edu backlink ranked higher in the SERPs?
Lots of people chase after domains with high Google PR that are "relevant" to their niche ... not sure exactly why all those criteria matter.
I tend to look for sites now that have RSS feeds that can be leveraged ... if not, you can convert the HTML page to an RSS feed URL and aggregate the page where you left your links through RSS feed aggregation sites.
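For the "convert the HTML page to an RSS feed" step, third-party services normally do the conversion; the sketch below is just a minimal illustration of the kind of RSS 2.0 document that ends up wrapping the page you dropped a link on. The feed title, description, and URLs are made-up placeholders, not anything from a real service.

```python
import xml.etree.ElementTree as ET

def page_to_rss(page_url: str, title: str) -> str:
    """Wrap a single page URL in a minimal RSS 2.0 feed document."""
    rss = ET.Element("rss", version="2.0")
    channel = ET.SubElement(rss, "channel")
    # Placeholder channel metadata -- a real feed would describe your site
    ET.SubElement(channel, "title").text = "Pages with my links"
    ET.SubElement(channel, "link").text = "http://example.com/"
    ET.SubElement(channel, "description").text = "Pages I have placed links on"
    # One <item> per page you want the aggregators to pick up
    item = ET.SubElement(channel, "item")
    ET.SubElement(item, "title").text = title
    ET.SubElement(item, "link").text = page_url
    return ET.tostring(rss, encoding="unicode")

print(page_to_rss("http://example.com/guestbook.html", "Guestbook page"))
```

The resulting feed URL is what you'd submit to the aggregation sites in the steps below.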
90% of the time when I place a link I ...
1. Capture the page's html URL
2. Capture the page's RSS FEED URL
3. Bookmark the page at Delicious/JumpTags/Technorati/StumbleUpon/Google et al ... ScuttlePlus is also an active bookmarking site
4. Submit the RSS FEED URL to Feedage.com / Feedagg and a few other aggregation sites
5. Site submit to Yahoo
6. Pingler or Ping-O-Matic all or most of the above
7. Store the URLs I captured in a CSV file so third-party automation tools can rinse and repeat on an automated basis - mainly pinging ...
That usually gets the bots crawling.
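Steps 6 and 7 can be sketched roughly as below: appending the captured URLs to a CSV log, and sending the standard weblogUpdates.ping XML-RPC call that Pingler/Ping-O-Matic style services accept. The Ping-O-Matic endpoint and the column names are assumptions for illustration, not a prescribed setup.

```python
import csv
import xmlrpc.client
from pathlib import Path

def store_urls(csv_path: str, page_url: str, feed_url: str) -> None:
    """Step 7: append the captured HTML and RSS URLs to a CSV log
    so automation tools can re-ping them later."""
    new_file = not Path(csv_path).exists()
    with open(csv_path, "a", newline="") as f:
        writer = csv.writer(f)
        if new_file:
            writer.writerow(["page_url", "feed_url"])  # assumed column names
        writer.writerow([page_url, feed_url])

def ping(title: str, url: str):
    """Step 6: the standard weblogUpdates.ping XML-RPC call.
    Endpoint is an assumption -- Ping-O-Matic's documented RPC host."""
    server = xmlrpc.client.ServerProxy("http://rpc.pingomatic.com/")
    return server.weblogUpdates.ping(title, url)
```

`store_urls("links.csv", page_url, feed_url)` builds the file a scheduled job can loop over, calling `ping()` for each row.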
- - - - - - - - - - - - - - - - - -
One thing I've learned from experience:
If you can get a site with an .edu extension to link to you, it is worth about 100 commercial sites rated close to your own positioning. It is not easy, but it can be done with some imagination -- I did it, and I am not especially imaginative.
How important is it to check whether a web page is showing up in Google's index before posting links? It would seem counterproductive to post links in directories that aren't indexed.