Jeremy Rivera SEO

ORGANIC ONLINE MARKETING

Websynthesis: growing a healthy website.

Before you commit to planting a website, make sure you have enough time to devote to the project. Without proper attention over time, your website, links and content will get stale and rot. An hour or two each week is a good amount of time to ensure a successful and blooming website.

Choose carefully when you start out making your website; some breeds require far more skill and handling than others. I would encourage you to practice first by building a profile on a social network site like LinkedIn, Myspace or Facebook. These are starter projects with all the tools you need already provided.

Think About The Niche – Choose The Type Of Seed

Planting your website

Once you’ve had a little bit of practice, take a good long week thinking about what kind of output you want from your site. Just something pretty to look at? An annual website, like one for a festival or event, that has its season and is done? Or do you have something in mind that will be perennial and will draw interest again and again?

Are you looking for something that’s just for looks or do you want something that will produce for you?

When you’ve decided the type of website to plant, lay some ground work out first on paper for you to refer back to so you can keep track of your progress.

Preparing the soil is very important, so be sure to pick a domain name that has enough room for your site to grow roots and spread out. A poorly chosen domain name can severely restrict your site’s ability to grow.

A lot of websites wither under the glaring sun of Google, so make sure your site is prepared to handle the scrutiny. There is always a temptation to try out “secret tips and tricks” that are supposed to make your plant grow unnaturally fast, but these almost always have side effects and can kill your website.

Get Your Site Hosting – Choose Your Soil

When you know if you want a perennial or annual site, choosing hosting is much easier. If it’s cheap and easy then so much the better for your annual site. But if you’re going long term, then consider slightly more expensive hosting, because it will give you a better structure for your site to grow into.

Infographic: the different types of web hosting

Now that you’ve laid your groundwork, it’s time to dig into that soil and plant your site. There are two basic approaches to this process: either plant a massive bulb that already has enough nutrients to burst forth quickly, or start with a smaller site that may need time to germinate.

Naturally, your site won’t be basking in the warm glow of Google for a while as it germinates, takes root and sends up its sprouts. It’s easy to get overanxious in this waiting stage, and you will find yourself religiously checking its early placement in Google.

Take a deep breath, and get to work on adding the best fertilizer on the market, good ol’ fashioned backlinks.

Links: Bullshit Makes Great Fertilizer?

You can use many different kinds of links to fertilize your site. These will allow your roots to spread out evenly, and their link love will get absorbed into your site’s overall quality. Beware: a lot of links aren’t fertilizer, they’re bull sh*t that you’ll pay an arm and a leg for and that will have no benefit for your site whatsoever.

Choose quality inbound links from social networks, other local website growers, or from directories. The best is a combination of all three kinds; these will help your site flourish under Google.

Once your site is sprouting, it will be up to you whether you want to pay for Miracle-Gro, because pay per click can get expensive over time. You can also use way too much of it and waste your investment very quickly.

Tend to your site, check its roots, and you’ll most likely want to encourage branches on your site. These branches may come off on different topics, and will give you more area for Google to spread its loving glow over. Be mindful to trim and prune over time, because your branches may get too heavy and cause your site to topple over.

Sometimes, you can graft on new branches that weren’t there initially. Be careful when you do this that you aren’t stealing other people’s branches. This practice tends to fail, as the branches have little to do with your original site and will wither under Google’s Golden Glare.

“A healthy website with evergreen content will yield fruits for years to come; it’s basically like a garden.” – Tadeusz Szewczyk

The more time you spend pouring your energy into the site, the more it will be watered and continue to grow. Once your energy mixes with the chlorophyll in your website and the warm glow of Google, you will see the magic of Websynthesis. This combining of the site and your time will begin to produce fruit or flowers, depending on your original choice. Keep the site fertilized and watered, and it will flourish.

So… are you ready to get your hands dirty and start planting?

That’s not SEO. That’s just dumb.

I’ve heard a lot of things over the years as I’ve worked in the unique field of SEO. Ours is an industry without a university or valid certification program, where shysters claim to have the cure-all for everything and relatives of business owners chime in with all kinds of “helpful advice”. So I sent around some Twitter DMs to other SEOs I’ve run into over the years to ask them one simple question:

What’s the dumbest thing you’ve heard someone claim was a valid SEO strategy?

Who:  Tadeusz Szewczyk   Twitter: @onreact_com Keyword Anchor Text Link: SEO 2.0

After the redesign (that ignored most of my advice) the agency exec says “now we can add some content in the footer for the SEO”.

Who: Callis Twitter: @callis1987  Keyword Anchor Text Link: Further Strategic Digital Marketing

Just last month I was involved in a respected and renowned weekly SEO Twitter chat. The first two questions were along the lines of “what can I do to avoid common indexation issues?” and “what can I do to avoid common content duplication issues?” At least 3 people for each question answered “Use schema markup”… Schema has plenty of uses, of course, but if your site isn’t getting indexed correctly or has content duplication issues, no amount of schema is going to fix that. It’s the new buzzword.

Who: Tanner Petroff    Twitter: @tannerpetroff Keyword Anchor Text Link: SEO 2.0

[Client said that] “I’m spending $5k/mo on AdWords so I get better natural search placement, too.”

Who: Kaltronis   Reddit: /r/BigSEO

Automotive SEO has not improved much from 2009. We still have vendors claiming you don’t need responsive websites, pagespeed is of no concern, duplicate content across hundreds of sites is par for the course, and no SEO should cost more than $1200 a month. It’s a crazy isolated space that spends all the money, and gets the finest techniques of five years ago. *There ARE a few good companies, but they cost a premium (Just like every vertical)*

Who: Jason Brown  Twitter: @keyserholiday Keyword Anchor Text Link: Orange County SEO

[I was told] that clicks on pages matter more than any other SEO process. The site owner had each employee click through 50k pages a day. No joke.

Who: Steven Macdonald  Twitter: @StevenMacd0nald  Keyword Anchor Text Link: Conversion Rate Crash Course

An SEO guy to my client said “You only have 200 pages. That’s not enough pages to rank in Google!”

Who: A.J. Ghergich   Twitter: @SEO Keyword Anchor Text Link: Content Marketing Agency

Here is a dumb thing I actually see over and over: a brand new site focusing all of its “SEO” efforts on on-page optimization and site architecture. They obsess over title tags, the perfect H1, flawless canonicalization, etc.

6 months later they label SEO as snake oil and move their focus into direct sales channels. On-page optimization is vital…but only if you are earning the links that make Google trust those optimizations.

A new site should really focus 80% of its efforts on link earning, not keyword tweaking.

Who: Matt O’Connor   Twitter: @OC2015

That you needed to put keywords in white text, on a white background on relevant pages that needed to rank highly.

Who: Don Rhoades  Twitter: @TheGonzoSEO Keyword Anchor Text Link: Search Marketing Savant

I’d have to say “just create great content” meaning: you don’t (shouldn’t have to.. whatever) do any link outreach to your “great content”. I think Paul May crushed dem skulls of that crowd when he wrote about fairy dust.

Who: Wayne Barker Twitter: @wayneb77 Keyword Anchor Text Link: Boom Online Marketing

I guess we are talking tactics over strategy here. It’s often a case that I may infer what has been said by the actions that the unnamed take part in. A common one is that blogging is a solid tactic in isolation from other work. We are currently combining, rewriting, deleting and editing the blog of a client whose last company pumped out average post after average post. Frustrating.

Another is quite prevalent at the moment and that is going big wins everything. Going big can be good if you are really confident that you have tapped into something.

There are just loads and loads of them. “If you are doing PR well you don’t need an SEO”, “You don’t need technical SEO”.

If someone says something can be done in isolation, without tapping into other channels or without other people involved, it’s BS. “We can do your SEO without input from the client or their staff” is a bullshit strategy.

Who: Zak Nicola   Twitter: @ZakNicola

A guy thought that it would be smart to take a car dealership website and break it up into five URLs: one for new cars, one for used cars, one for service, one for tires and one just for the dealership info. The top navigation would link out to the different URLs and would swap out depending on which URL you were on.

His idea was that having five different websites would gather more traffic and that the linking between them would help increase rankings. He basically wanted to take all the pillars of relevant content that fall under the category of being a dealership, break them up into individual sites and link them together. It was just a bad idea. This was before the EMD update, and it had some legs for about 6 months. But he held to this tactic after the update, even in the face of being sandboxed across thousands of dealership websites.

So heard anything stupid recently? Add a comment or DM me on Twitter to get your answer added to my list.

 

Musings on Service Area vs Location

So I have this Nashville roofing client (hell yeah, that’s a followed anchor text link to them. Not sure when we got so scared to link out! But that’s another rant for another day).

Annnnyyyyways, they do roofing for Nashville and its surrounding cities, but they happen to have chosen an office located in Hendersonville, a small suburb north of Nashville.

Of course, when we connected he wasn’t ranking organically on the first page for any Nashville-area keywords or phrases. Nor was he ranking in Maps outside of Hendersonville.

He was more and more worried that, given his competitors’ closer proximity to the centroid, he was never going to be able to rank for Nashville queries in Maps.

 My View On Service Areas

Here’s what I suggested we do: change his location to a service area, improve his website, build local links AND clean up and expand his citations at the same time. Once we started surfacing organically for Nashville queries, we would look to see if our Maps rankings followed suit. If not, then we would look into opening a new office in Nashville proper and build out the citations for the new locale.

What would you have done?

My View On Brick & Mortar

Another person I recently talked to was Jeremy, who runs a Nashville residential & commercial cleaning company called The Cleaning Executives. He DOES have a brick-and-mortar office location, but really provides services at his clients’ locations. So in this situation, I recommended we check his citations… of which he had 3 or 4. So we checked out the top-ranking competitors with Bright Local’s citation tool and came up with a game plan to get 2x as many citations as the top competitor.

Nashville SEO “Group Therapy” Session #2

Depressed about your website’s current lack of rankings and traffic?

Frustrated with your conversion rates?

Are you feeling sad, lonely and confused about SEO for your business? Why not get together and talk it out.

Join me to talk out your current SEO frustrations in a friendly group setting. I’m more than happy to chime in with my own suggestions and advice and give you the opportunity to hear from other Nashville small business owners who might be struggling with the same issues. So come join me, and whoever else shows up for some FREE SEO CONSULTING. 

Where:  Thistlestop Cafe  – West Nashville
5128 Charlotte Ave, Nashville, TN 37209

When: Wednesday, 9am, CST

What: A friendly meetup to talk about SEO

How much: Free!

Nashville SEO “Group Therapy” Session

Are you feeling sad, lonely and confused about SEO for your business? Join me at Slow Hand Coffee on Thursday, at 1:00pm to talk it out in a friendly group conversation. I’m more than happy to chime in with my own suggestions and advice and give you the opportunity to hear from other Nashville small business owners who might be struggling with the same issues.

Where:  Slow Hand Coffee Shop – Downtown Nashville

When: Thursday, 1pm, CST

What: A friendly meetup to talk about SEO

How much: Free!

Properly Leverage Tags in WordPress for SEO Glory

Tags can lead to useless stub pages, because each time you create a new tag, WordPress makes a new archive page for it. If you’re not careful, you end up with the same single post appearing on 5-10 almost completely duplicate pages. This is bad for users and bad for Google, since Google has to crawl those pages and then disregard them as having no unique value.

This leads to the most common suggestion: Block them via Robots.txt
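In practice that suggestion is a two-line rule. A minimal sketch, assuming your tag archives live under WordPress’s default /tag/ base (adjust the path if your permalink structure differs):

```
# Keep crawlers out of WordPress tag archive pages
User-agent: *
Disallow: /tag/
```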

Blocking Isn’t ALWAYS The Solution

There are times, however, when blocking those tag pages means missing out on useful, relevant pages that users and Google will find valuable!

When To Embrace Tags

If your Analytics show decent levels of traffic to certain tag pages, then you might have a good reason to unblock those pages.

If your categories can’t capture a very useful aspect of the content that your users want to find, then use tags as a secondary method to surface that information for Google and users.

How to Optimize Your Tags

  • Limit their numbers. Think out their specific use and don’t just throw them around. The closer you can get to just 1 tag per post, the less chance of making useless stub pages.
  • Add unique content and a call to action to those pages. Turn them into worthwhile landing pages for your site!
  • Update those default Yoast meta titles to remove “Archives”. YUCK!
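As a sketch of that last tweak, here’s what the change looks like in Yoast’s %%variable%% title-template syntax (the exact default wording varies by Yoast version, so treat the first line as illustrative):

```
Default tag archive title (roughly):
  %%term_title%% Archives %%page%% %%sep%% %%sitename%%

Cleaner version without “Archives”:
  %%term_title%% %%page%% %%sep%% %%sitename%%
```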

My Client Example:


One of my clients does property management, and their blog was a mess with over a dozen tags on EVERY post and 2-3 categories too! We initially were going to dump them but realized that using tags for highlighting the City & State was useful for users and search engines too!

Finally A Tool To Fight Google Analytics Spam

Fighting Referral Spam in Google Analytics

[Jeremy Says: I’ve been struggling with GA referral spam for a few months and was hitting a creative wall. Fortunately others ran into the problem too and found a solution. I invited Jeff Siebach to tell you more about the problem and what he & his company came up with as a solution. Enjoy!]

Jeff Siebach says:

Over the last 6 months, you probably noticed a big increase in your reported referral traffic in Google Analytics. “Yes! I’m finally getting some visitors to my website!” you exclaimed happily.

Then you clicked to see where the traffic came from and your emotions changed from excitement to confusion. All of the traffic appears to be from ‘free-social-buttons.com’ and ‘Get-Traffic-Now.com’.

But why did those sites start sending you traffic, you ask?

That Referral Traffic Spike is Bogus

The answer is: they didn’t. What you’re seeing is a new form of spam that is sweeping Google Analytics accounts around the world. Referral spam, also known as “ghost referrals”, is the newest way advertisers have found to promote their websites to website owners.

By faking traffic to your website, these spammers create mounds of referral traffic in your Analytics account in the hopes that you’ll see their website name and visit their site.

Here’s an example of what this may look like in your GA account:

Example of Google Analytics Spam

 

So these bots are visiting my website?

Not exactly. When you add the Google Analytics tracking code to your website, it includes a unique “property ID” that identifies your Google Analytics account and looks something like “UA-123456-78”. When a visitor lands on your site, a small piece of JavaScript code sends Google a notification with that property ID to let them know that someone has viewed a particular page.

These bots crawl through the internet daily hunting down new property IDs that they haven’t found before. Then, once they’ve locked on to your ID number, they send notification after notification to Google stating that your page has been viewed.

They are not actually sending traffic to your website, just sending Google notifications that your website has been viewed and the source of the page view was (insert spam website here). Google can’t tell whether the page view actually happened on your domain or not – they just record the information they receive and report it back to you in your GA dashboard.
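To make the mechanism concrete, here’s a rough sketch of what one of these ghost hits looks like as a (Universal Analytics) Measurement Protocol request. The property ID and spam domain are made-up examples, and the snippet only builds the URL rather than sending anything:

```python
from urllib.parse import urlencode

# A "ghost referral" is nothing more than a pageview hit posted to
# Google's Measurement Protocol endpoint. The spammer only needs a
# harvested property ID; your server is never contacted at all.
def build_ghost_hit(property_id, spam_domain):
    params = {
        "v": "1",                       # protocol version
        "t": "pageview",                # hit type
        "tid": property_id,             # the harvested UA-XXXXXX-YY ID
        "cid": "555",                   # arbitrary client ID
        "dp": "/",                      # faked page path
        "dr": "http://" + spam_domain,  # the spam "referrer" you see in GA
    }
    return "https://www.google-analytics.com/collect?" + urlencode(params)

hit = build_ghost_hit("UA-123456-78", "free-social-buttons.com")
print(hit)
```

Everything Google sees arrives in that query string, which is why it can’t tell a ghost hit from a real pageview on your domain.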

Well if they aren’t visiting my website, what’s the problem?

The unfortunate result of all of this spam is that your Google Analytics data is completely skewed because of all of the fake visits. These visits are often characterized by 0:00 time on site and a 100% bounce rate. They never convert on your conversion goals, reducing your conversion rate numbers, and they ramp up your referral traffic numbers, masking the important information you need to know about how people are getting to your site.

If your website has a lot of traffic, this may not be impacting you too greatly, only accounting for 5-10% of your traffic. But for most of us, who are working to drive traffic to a blog, or a new business, this spam may be responsible for 80% (or more!) of your reported visits.

GA Spam...You just stepped in it!

Ok, now I’m mad! How do I get rid of Google Analytics Spam?

Blocking this traffic from appearing in your Google Analytics account can be done using GA filters. You can eliminate this traffic from your reports by using “Exclusion Filters” which can be found in the Admin -> Account -> Filters section of Google Analytics.

To block one of these sites, you’d create a filter with the following settings:

Here's how you add a filter to block GA spam

Save your filter, link it to the views that you want it to apply to, and voila! That site will no longer appear in your referral traffic moving forward.
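Under the hood, that filter pattern is just a regular expression matched against the referral’s campaign source. A small sketch of how one pattern can cover several spam domains at once (the domain list is illustrative):

```python
import re

# One GA exclusion filter pattern can match several spam sources:
# escape the dots so "." means a literal dot, then join with "|".
spam_domains = ["free-social-buttons.com", "get-traffic-now.com"]
pattern = "|".join(re.escape(d) for d in spam_domains)

# Matches the spammers...
assert re.search(pattern, "free-social-buttons.com")
# ...but leaves legitimate referrers alone.
assert not re.search(pattern, "twitter.com")
```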

One filter down, 200+ sites to go!

Blocking out all of the spam sites individually is a daunting task. Every day, new spam sites are being created and manually adding these filters is like an endless game of whack-a-mole.

No matter how thorough you are blocking all of the spam sites, new ones will appear the next day – and you may have ten or even hundreds of GA accounts you need to filter.
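Part of why this turns into whack-a-mole is that a single GA filter pattern is capped at roughly 255 characters, so a long blacklist has to be split across several filters. A sketch of that chunking logic (the limit and domain names here are illustrative):

```python
import re

MAX_FILTER_LEN = 255  # GA's approximate per-filter pattern limit

def chunk_filters(domains, max_len=MAX_FILTER_LEN):
    """Group escaped domains into '|'-joined patterns under max_len chars."""
    filters, current = [], ""
    for d in map(re.escape, domains):
        candidate = d if not current else current + "|" + d
        if len(candidate) > max_len and current:
            filters.append(current)  # current pattern is full; start a new one
            current = d
        else:
            current = candidate
    if current:
        filters.append(current)
    return filters

# 40 fake spam domains won't fit in one 255-character pattern,
# so they get spread across multiple filters.
patterns = chunk_filters(["spam%d.com" % i for i in range(40)])
```

With a blacklist of hundreds of domains, this yields a dozen or more filters, which is exactly the busywork an automated tool can take off your hands.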

An automated solution

In order to combat this spam for good, we at Anchor Metrics created a tool called the GA Spam Fighter.  We’re working with the team at SpamScape.net to maintain an up-to-the-day list of all of the known spam sites that have been plaguing Google Analytics.   Then, we created a one-click solution to add filters to a GA profile that will block the entire list of sites at the same time.  

To try it out, visit http://gaspamfighter.com.

First, you will be asked to link up your Google Analytics account and select the profile you want to filter. Click “Add Filters” and we will generate a handful of filters that exclude traffic from the newest list of spam websites (currently almost 300 domains long!).

These filters don’t remove traffic retroactively, but from that day forward these sites will no longer appear in your reports. [Jeremy Says: You’ll need some custom Segments for that]

A week later, when you’ve noticed some new spam sites appearing, make your way back to the GA Spam Fighter and run the filters again. We’ll automatically locate the original filters we made, identify any new spam sites that you aren’t blocking yet, and create another filter for the new sites.

Think of it like updating the anti-virus software on your computer.

Sounds great!  How can I help?

If you’d like to help us maintain the software, head on over to SpamScape.net and anonymously link your Google Analytics profiles to their tool.  

They’ll peek at your Analytics data daily to identify new spam sites and include them in the “blacklist”.  Since the GA Spam Fighter pulls sites from that blacklist, you’ll be helping the tool keep up-to-date on the latest thing spammers are doing to ruin your day.

 Together we can defeat these spammers once and for all.

Who Came Up With This?
Anchor Metrics develops tools for marketers to help them better track and report on digital marketing campaigns.  Their digital marketing dashboard pulls data about your website traffic, ad campaigns, online reviews, social media profiles, and more into one place, letting you view and export reports for all of your campaigns.  They also allow you to white-label a portal so your clients can log in and view their reports 24/7. 

A Breakdown of 10 SEO Site Crawl Tools

Beam Us Up

• User Interface: Desktop Application
• Platforms Supported: Windows, Macintosh – Uses Java – Be sure to update to latest version.
• Cost: Free
• Crawl Scope: As many pages as you want (it uses local memory, so big sites just take time to crawl). Of course, you may regret trying to open an Excel file with 500,000 entries, so crawl responsibly.
• Resource Article: A Simple DIY Audit

Screaming Frog
• User Interface: Desktop Application
• Platforms Supported: Windows, Macintosh, Ubuntu
• Cost: Free version crawls 500 URLs, $99 yearly license
• Crawl Scope: 500 Pages, Site wide
• Resource Article: Screaming Frog SEO Spider Guide

Raven Site Auditor and Site Performance:
• Company/Parent Toolset: Raven Tools
• User Interface: Web Application, Browser Add-on
• Cost: Paid $99 and $249 a month (Part of overall tools set), Free 30 day trial
• Crawl Depth:
• Unique Function:
• Resource Article:

SEOmoz Crawl Diagnostics:
• Company/Parent Toolset:
• User Interface: Web Application
• Cost: $99 a month (part of full SEOmoz subscription),
Free 30 day trial
• Crawl Depth:
• Unique Functions:
• Resource Article:

Craawler
Nifty tool created by Tom Johns that does a full site crawl, picking out site errors and spelling errors, and showing tweets, likes and +1s page by page.
• Company/Parent Toolset:
• User Interface: Website
• Cost: Free
• Crawl Depth: Site Wide
• Unique Function: Checks spelling and provides tweets and +1s on a page by page view
• Resource Article:

A1 WebSite Analyzer
• Company/Parent Toolset: Microsys
• User Interface: Desktop Download
• Platforms Supported:
• Cost: $69, with a free 30-day trial
• Crawl Depth: Sitewide
• Unique Function: Provides graph breakdowns of page objects and header responses.
• Response from tool creator:

A1 Website Analyzer is a Windows and Mac website and link analysis tool.
After a site crawl, it can filter and show you all pages (including broken links
to internal and external URLs), file sizes, response codes, duplicate content,
titles, descriptions, H1, H2 canonical, internal anchor text to pages, link
line numbers, link follow/nofollow states, navigation click length, all top
keywords and a ton of other data. It will also calculate internal link juice
“score” of all pages. (The tool has been in constant development since 2006.)

If configured so, it can also validate HTML/CSS, spell check and
perform custom search for text/patterns throughout entire website
and integrate with various online tools.

There is no fixed limit of URLs (or links for that matter), and the
practical limit is well above 100,000 unique URLs – in fact, some have
reported success crawling many-million-page websites, but it depends
on factors such as the computer, website and program configuration.

For those with complex websites, the tool features an extensive
set of website crawl “limit to” and “exclude” filtering options.
It is also possible to add multiple “start crawl from” paths.

The program is shareware with a fully functional 30-day free
trial, but if you need a licensed review copy with no strings attached
before testing/deciding, I will be happy to send one of course.

The license is not subscription based and not per-month or per-year.
For those who buy, it is a one-time price at $69 for version 7.x
including a free 8.x upgrade if released within one year of purchase
with no built-in restraints on how long you can keep using the software.

For a full run-through and guide (including video) of all the features and uses, see:
http://www.microsystools.com/products/website-analyzer/help/site-analysis-seo-audit/

P.S:
It is worth noting that A1 Website Analyzer also has five sibling
tools – one of them being A1 Sitemap Generator, which can create
XML sitemaps, video sitemaps, image sitemaps, HTML/CSS sitemap etc.


w3c Link Checker
• Company/Parent Toolset: W3C
• User Interface: Website
• Cost:
• Crawl Depth:
• Unique Function:
• Resource Article:

IIS 7 – SEO Toolkit

• Company/Parent Toolset: Microsoft
• User Interface: Desktop Download
• Platforms Supported: Windows
• Cost:
• Crawl Depth:
• Unique Function:
• Resource Article:

Sucuri

• User Interface: Website and Web Application
• Cost: Free scan and $89.99, $189.99, $289.99
• Crawl Depth: Site wide
• Unique Function: Malware identification and cleanup
• Recommended Resource Article:

Web Data Extractor

• Platforms Supported: Windows 95/98/2000/NT/ME/XP/Vista/7
• User Interface: Desktop Download
• Cost:
• Crawl Depth: Sitewide
• Unique Function:
• Resource Article:

Quotagraphic

 

I used Piktochart, changed the file extension from .GIF to .PNG, added the text, published it and then embedded it.

Does This Image by Buffer’s Pablo Look Pixelated to You?

I love Pablo by Buffer. But when I made this graphic for one of my client’s sites a coworker thought it looked “pixelated”. Here’s the offending image.

So… What’s The Deal?

Hmm…it does kinda look wibbly wobbly around those edges.

Thinking about it, I was just rolling with the default font, which is Merriweather.

So I thought I’d output the same image in all the fonts to see if it’s the image output or the font choice. Let’s see if they’re all pixelated or just that one:

Fredoka-One

Hammersmith

Josefin Slab

Lato

Merriweather – The OG: Does this font make my image look pixelated?

Montserrat

Open Sans

Roboto

Satisfy

Ubuntu

Other Small Pablo Complaints

What’s Your Judgement?

Are they all pixelated or is it just Merriweather?


© 2016 Jeremy Rivera SEO
