Myths and Sites:
submitting and re-submitting your site –
a waste of time
Top 10 myths
Duplicate content: Google doesn’t penalize it – but it does ignore/filter it
make it easy to find the preferred version –
help Google crawl efficiently
be reasonable when re-using content from other sites
Meta tag – verification only
if it’s reachable, it’s not a problem – account / site name / domain name
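The verification meta tag goes in the page’s `<head>`. A sketch of what it looks like – the `content` token below is a placeholder, not a real value:

```html
<!-- Site-verification meta tag: proves ownership to Google; it does not affect ranking. -->
<!-- The content value is a placeholder; use the token Google generates for your account. -->
<head>
  <meta name="google-site-verification" content="YOUR-TOKEN-HERE" />
</head>
```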
over-submitting to directories
using Google tools helps your rankings – WRONG!
no optimal keyword density
write naturally for users
too many mentions distract users
don’t hide what you mean
XML sitemaps –
don’t hurt your rankings
XML sitemaps are useful
keep them up to date – see sitemaps.org
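The sitemaps.org format mentioned above is a small XML file. A minimal sketch – the domain and dates are placeholders:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<!-- Minimal XML sitemap per sitemaps.org: list each preferred URL once and keep <lastmod> current. -->
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://example.com/</loc>
    <lastmod>2009-01-15</lastmod>
    <changefreq>weekly</changefreq>
  </url>
</urlset>
```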
PageRank: not everything, but still important – keep it in mind, but don’t make it your only focus
one of “over 200 factors”
toolbar PageRank is only updated a few times a year
no real need to chase it
once ranked, don’t touch the site – myth!
other sites keep changing to catch up
give users fresh content
valid HTML: less than 5% of pages are valid
it may help for browsers, but it’s not a ranking issue
robots.txt Disallow will remove your site’s pages from the index – myth
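For context on that myth: a Disallow rule only blocks crawling. A sketch of a robots.txt (the path is illustrative):

```text
# robots.txt – Disallow blocks crawling of matching paths,
# but it does not by itself remove already-known URLs from the index.
User-agent: *
Disallow: /private/
```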
the more links, the better – myth
the only goal is ranking first – no: focus on users
let old URLs return 404 so the new structure is discovered – no: use 301 redirects
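A 301 can be set up at the server level. A minimal Apache sketch, assuming mod_alias is enabled; the paths are placeholders:

```apache
# .htaccess – permanent (301) redirect from an old URL to its replacement,
# so visitors and links are forwarded instead of hitting a 404.
Redirect 301 /old-page.html https://example.com/new-page.html
```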
adding meta robots “index, follow” – not important (it’s the default)
hosting on a shared IP will drop rankings – not the case
debunk the myths you hear
“How do I tell Google which version of my content I prefer?”
when the duplicates are on the same domain –
put the preferred URL in the sitemap
keep your internal linking consistent – always link to the preferred URL
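Consistent internal linking means always pointing at the same form of each URL. A hypothetical example – the domain and paths are placeholders:

```html
<!-- Always link to the one preferred form of the URL... -->
<a href="https://www.example.com/widgets/">Widgets</a>

<!-- ...rather than mixing variants that look like separate pages: -->
<!-- http://example.com/widgets                  (no www, no trailing slash) -->
<!-- https://www.example.com/widgets/index.html  (explicit index file)       -->
```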
“I deleted old posts, old categories, and tags, and still find several not-found URLs – will this fix itself?”
Webmaster Tools isn’t updated in real time; it will find the correct locations over time. If you removed a page, it’s fine to return a 404 – that won’t hurt you. If you can set up 301 redirects, do so; there’s no limit on the number of 301s on a domain. Google should pick this up and correct it over time.
“Does a 301 redirect pass PageRank / link juice to the new page?”
Yes, it transfers to the new page over time.
Google – three more:
Wiz: To make sure a page does not get indexed, add a noindex meta tag to the page. You need to allow Google to crawl the page so it can see that it shouldn’t be indexed.
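The noindex answer above as markup – the tag goes in the page’s `<head>`, and the page must stay crawlable (not blocked in robots.txt) for Google to see it:

```html
<!-- Keep this page out of the index; Google must be able to crawl the page to read this tag. -->
<head>
  <meta name="robots" content="noindex" />
</head>
```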
covering a new tool –
crawl error sources –