Wednesday, May 30, 2018

5 Important Features of an SEO-Friendly Web Host

SEO-Friendly Hosting: 5 Things to Look For in a Hosting Company

As SEO professionals, we have no shortage of things to worry about.
There are the old standbys: backlinks, content creation, sitemaps and robots.txt files.
And there’s new(er) stuff to get excited about as well: voice search, featured snippets, the mobile-first index.
Amidst the noise, one factor often goes overlooked, even though it can impact your site’s uptime and your page speed – both of which are essential elements for maintaining positive organic performance.
I’m talking about web hosting, folks.
The web host you choose determines the overall consistency of the site experience you offer organic visitors (and all visitors, for that matter).
If you want to prevent server errors and page timeouts – and stop users from bouncing back to Google – you need a solid web host you can rely on.
Ultimately, you want a web host that supports your organic efforts, rather than impeding them. Let’s look at five key features that define an SEO-friendly web hosting company.

1. High Uptime Guarantee

Your host’s uptime guarantee is arguably the most important factor in whether they’re SEO-friendly.
Uptime refers to the percentage of the time your site is online and accessible. The higher your uptime, the less likely visitors will visit your site only to discover it’s down, sending them back to the search engines and potentially risking your rankings in the process.
Better, more reliable hosts offer higher uptime guarantees.
For best results, choose a host with at least a 99.9 percent uptime guarantee (or higher, if you can get it). That translates to roughly 1.44 minutes of downtime a day, or about 8.8 hours per year. Not bad.
However, be wary of any host that claims 100 percent uptime. There’s always going to be some downtime. The key is to keep it as short as possible. That way, it won’t affect your SEO performance.
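If you want to sanity-check an uptime guarantee yourself, the arithmetic is simple. Here’s a minimal Python sketch (the figures it prints for 99.9 percent match the ones above):

    def downtime(uptime_percent):
        # Fraction of time the site is allowed to be down under the guarantee.
        down = 1 - uptime_percent / 100
        return down * 24 * 60, down * 365 * 24  # minutes per day, hours per year

    for pct in (99.0, 99.9, 99.99):
        per_day, per_year = downtime(pct)
        print(f"{pct}% uptime -> {per_day:.2f} min/day, {per_year:.2f} h/year")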

2. Server Location

While uptime refers to your site content being accessible to users, your server location may dictate how quickly it’s accessible to them.
If you’re on a shared, VPS, or dedicated server hosting plan, your site lives on a physical server in a data center somewhere (as opposed to cloud hosting, where your data is distributed across many servers).
Ideally, you want that data center located as close as possible to the majority of your site visitors. The farther away your server is, the longer it can take for your site to load.
Server location can also look fishy to search engines, which may affect your SEO. If you operate in one country but use a host located halfway around the world, it may look as if something nefarious is going on.
It goes without saying that servers themselves should also be fast, and that the host should further boost performance through a Content Delivery Network (CDN).
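If you want a rough feel for how server distance and speed affect your visitors, you can time a simple request from where you (or your users) are. A minimal sketch using only Python’s standard library; example.com is a placeholder, and the number will vary with network conditions:

    import time
    import urllib.request

    def response_time(url):
        # Time a full GET request, including downloading the body.
        start = time.perf_counter()
        with urllib.request.urlopen(url, timeout=10) as resp:
            resp.read()
        return time.perf_counter() - start

    print(f"{response_time('https://example.com'):.2f} seconds")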

3. Multiple Options

We all like options. You should enjoy them with your web hosting, too.
Beyond hosting itself, many hosting companies offer optional value-adds that can upgrade your site. Here are some of the SEO-friendly ones you’ll want to see:
  • Automatic backups: If something ever goes wrong, you want a site backup you can quickly restore from. See if your host offers automatic backups for free or for an added cost.
  • SSL: HTTPS has been a ranking factor for years now. If you haven’t already transitioned to a secure site, you need to get your act together. Make sure your host supports SSL; some even include a certificate for free with your hosting package. (See the sketch after this list for a quick way to check a site’s certificate.)
  • Multiple hosting plans: As your site grows, your hosting needs are likely to change (this is a good thing!). Eventually, your traffic numbers may be big enough to warrant switching to your own dedicated server. This transition will be easier (and cheaper) if you don’t have to switch hosting providers at the same time.
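On the SSL point, you can quickly confirm that a site serves a valid certificate and see when it expires. A minimal sketch using Python’s standard library; example.com is a placeholder and a live network connection is assumed:

    import socket
    import ssl

    def cert_expiry(hostname, port=443):
        # Open a TLS connection and read the certificate's expiry date.
        ctx = ssl.create_default_context()
        with socket.create_connection((hostname, port), timeout=10) as sock:
            with ctx.wrap_socket(sock, server_hostname=hostname) as tls:
                return tls.getpeercert()["notAfter"]

    print(cert_expiry("example.com"))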

4. Good Reviews

Alright, let’s say you’re actually using this list to compare hosts. By this point, you’ve read through their hosting features, and it appears they’re checking off all the right things.
Now it’s time to validate that the marketing claims are true. Before you sign up with a host, take a few minutes to read their online reviews.
A caveat: The hosting space tends to attract more unhappy reviews than most.
If a barista messes up your coffee, you’re unlikely to be bothered enough to write a scathing review for the cafe on Yelp.
But if your site goes down, even for a moment, or even if you were at fault (as can happen if you choose an inappropriate hosting plan for your traffic needs), you are going to be extremely angry with your host and tweet, post, and blog about it loudly and vociferously.
Unfortunately, that’s just the nature of the business.
Having said that, you can still gather a lot of valuable information from reviews. Look for hosts that appear again and again on Top Web Hosts lists, and read the reviews to verify that the hosting plan you’re considering is likely to give you what you need.
You won’t have trouble finding these lists. A quick Google search for [best web hosting] delivered a slew of results from PCMag, CNET, and more:
[Screenshot: Google search results for “best web hosting”]

5. Responsive Support Team

While you’re reading through the reviews, pay special attention to how people talk about their support.
In the unlikely event that your site does go down, you want to be able to fix it immediately. Most often, that will involve speaking to a support person.
A good host will offer 24/7 support for free. Verify the operating hours of your potential host’s support team, and see how exactly you’ll be able to get in touch with them. Is there a phone number, live chat, or email?
Check out their social profiles, too. Web hosts who care about helping their customers tend to make customer support widely available on social media, perhaps even via dedicated support Twitter accounts.
Here’s an example from Squarespace:
[Screenshot: Squarespace customer support account on Twitter]

Bonus: Easy-to-Use CMS

This one’s not exactly related to hosting, but it’s important nonetheless. Being able to easily create outstanding content is key for your SEO success. You know that.
So, you want a host that integrates with a CMS you’re either already familiar with or you can easily learn. Otherwise, you’re just making things hard on yourself!
Fortunately, most hosts today offer their own drag-and-drop content creation tools. Many also integrate with WordPress and other popular content management systems.

What Defines an SEO-Friendly Web Host?

Good, reliable web hosting is one of those things that runs in the background without you ever having to think about it. That, in essence, is an SEO-friendly web host.

Reference: https://www.searchenginejournal.com/seo-friendly-hosting/251817/?ver=251817X2

Tuesday, May 29, 2018

Google: Multiple Sites on Same IP is Not a Problem, But Similar Content Is

Google: Multiple Sites on Same IP is Not a Problem, But Similar Content Is

Google’s John Mueller recently clarified that having multiple sites on the same IP address is not inherently a problem.
However, it may become a problem if those sites have very similar content.
This topic came up during a Webmaster Central office-hours hangout. An individual was concerned about a decline in traffic among multiple sites on the same IP address.
In addition to being on the same IP address, the webmaster described these sites as having very similar content and site structure.
Mueller explained that having multiple sites on the same IP address is not a problem as far as Google search is concerned. That’s how the internet generally works, Mueller says.
A bigger issue is the similar content, as Google may regard that as being a collection of doorway sites — especially if the sites are funnelling users toward the same products.
Google may end up doing one of two things in these cases:
  • One: Google may decide to display content from just one site out of the collection of “doorway” sites.
  • Two: If Google regards a collection of similar content sites as being “doorways” then it might demote all of them.
Mueller ultimately recommended that this webmaster focus on making the content more unique, rather than being concerned about the sites all sharing the same IP address.
You can hear Mueller’s full response in the video below, starting at the 20:33 mark.


“All the same IP address — that’s really not a problem for us. It’s really common for sites to be on the same IP address. That’s kind of the way the internet works. A lot of CDNs (content delivery networks) use the same IP address as well for different sites, and that’s also perfectly fine.
I think the bigger issue that he might be running into is that all these sites are very similar. So, from our point of view, our algorithms might look at that and say “this is kind of a collection of doorway sites” — in that essentially they’re being funnelled toward the same product.
The content on the sites is probably very similar. Then, from our point of view, what might happen is we will say we’ll pick one of these pages and index that and show that in the search results. That might be one variation that we could look at. In practice that wouldn’t be so problematic because one of these sites would be showing up in the search results.
On the other hand, our algorithm might also be looking at this and saying this is clearly someone trying to overdo things with a collection of doorway sites and we’ll demote all of them.
So what I recommend doing here is really trying to take a step back and focus on fewer sites and making those really strong, and really good and unique. So that they have unique content, unique products that they’re selling. So then you don’t have this collection of a lot of different sites that are essentially doing the same thing.”

Reference: https://www.searchenginejournal.com/google-multiple-sites-ip-not-problem-similar-content/254799/

Monday, May 28, 2018

Four Research Papers About CTR and Ranking

Four Research Papers About CTR and Ranking

There have been many discussions about CTR and rankings. Some say CTR is a ranking factor; others insist it’s part of machine learning and quality control; a third group claims it’s all of the above plus a bag of chips. Regardless of which camp you pitch your tent in, here are four research papers I believe are helpful for understanding the role of CTR in search engine rankings and SEO.

Thorsten Joachims and the Study of CTR

Thorsten Joachims is a researcher associated with Cornell University. He has produced many influential research papers, among them research on the use of click-through rate for the purposes of search engine algorithms. If you are interested in understanding the possible roles of CTR in search engines, these four research papers authored by Thorsten Joachims will prove enlightening.

1. Optimizing Search Engines with CTR

Optimizing Search Engines Using Clickthrough Data (PDF) is a research paper from 2002. It introduced the concept of using CTR data as an indicator of how relevant search result links are, and of using that information to improve the ranking of web pages.
That this research paper is from 2002 shows just how old research into CTR is. Studying CTR for relevance information is a mature area of research, and search engine research has since progressed far beyond it.
Nevertheless, it’s important to gain a foundation of understanding of CTR. Once you have that foundation, you will be less likely to be fooled by baseless speculation about click-through rates and their role in ranking web pages.
Here’s what the research paper states:
“The goal of this paper is to develop a method that utilizes clickthrough data for training, namely the query-log of the search engine in connection with the log of links the users clicked on in the presented ranking.
…The key insight is that such clickthrough data can provide training data in the form of relative preferences.”
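To make “relative preferences” concrete: in Joachims’ scheme, a clicked result is taken as preferred over every unclicked result ranked above it. Here is a minimal Python sketch of that idea (my illustration of the concept, not code from the paper):

    def click_preferences(ranking, clicked):
        # A clicked document is preferred over every unclicked
        # document that was ranked above it.
        prefs = []
        for i, doc in enumerate(ranking):
            if doc in clicked:
                prefs.extend((doc, skipped) for skipped in ranking[:i]
                             if skipped not in clicked)
        return prefs

    serp = ["page_a", "page_b", "page_c", "page_d"]
    print(click_preferences(serp, clicked={"page_c"}))
    # [('page_c', 'page_a'), ('page_c', 'page_b')]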
In my opinion, this paper recognizes limitations in the algorithm. The algorithm is limited to learning which of the top ten links are most relevant; it learns nothing about the web pages on the second, third, or fourth pages of the search engine results pages (SERPs).
This is what the research paper observes:
“…there is a dependence between the links presented to the user, and those for which the system receives feedback.”
Right from the beginning of CTR research it was understood that CTR data from the top ten of the SERPs was of limited but important value. The research paper also notes that this kind of algorithm was open to spamming and that steps would need to be taken to make it immune.
This is what Thorsten Joachims noted:
“…it might also be possible to explore mechanisms that make the algorithm robust against “spamming”. It is currently not clear in how far a single user could maliciously influence the ranking function by repeatedly clicking on particular links.”
This is important information because it shows that even in 2002 researchers were thinking about how to prevent click spamming. What this means is that the advice to click on one’s own listing to upvote one’s own site probably doesn’t work.

2. The Concept of CTR as Biased Feedback

This paper, authored with a researcher from Stanford University, is entitled Accurately Interpreting Clickthrough Data as Implicit Feedback (2005, PDF). It is an important research paper because it introduces the idea that CTR data may not be all that reliable.
Here is how the CTR research paper expresses the idea that CTR data is noisy:
“This paper examines the reliability of implicit feedback generated from clickthrough data in WWW search. Analyzing the users’ decision process using eyetracking and comparing implicit feedback against manual relevance judgments, we conclude that clicks are informative but biased. While this makes the interpretation of clicks as absolute relevance judgments difficult, we show that relative preferences derived from clicks are reasonably accurate on average.”
This paper is concerned with understanding which links users have scanned, whether users scan from top to bottom, which links users linger over before clicking, and how the title and meta description in the SERPs influence the decision to click one link over another. That influence of the title and meta description on user behavior is the bias this research paper discovered.
Yet the paper was optimistic that, because there is a large amount of data to be mined, machine learning could be applied to reach accurate determinations of which links are more relevant than others.
The research paper on CTR reached this conclusion:
“Our results indicate that user’s clicking decisions are influenced by the relevance of the results, but that they are biased by the trust they have in the retrieval function, and by the overall quality of the result set. This makes it difficult to interpret clicks as absolute feedback.
However, we examine several strategies for generating relative feedback signals from clicks, which are shown to correspond well with explicit judgments. …The fact that implicit feedback from clicks is readily available in virtually unlimited quantity might more than overcome this quality gap, if implicit feedback is properly interpreted using machine learning methods…”
I believe it is important to note that this research paper is not concerned with finding spam or with finding low quality sites to exclude. It is simply concerned with finding relevant sites that satisfy users.

3. Machine Learning and Simulated CTR

The third research paper is also from 2005. It is titled Evaluating the Robustness of Learning from Implicit Feedback. The goal of this paper is to understand when CTR data is useful and when it is biased and less useful.
This is how the paper framed the problem and the solution:
“…this data tends to be noisy and biased… In this paper, we consider a method for learning from implicit feedback and use modeling to understand when it is effective.”
This paper is especially interesting because it introduces the possibility of modeling user behavior and using that model instead of actual user behavior. It also mentions reinforcement learning, a branch of machine learning often introduced with the example of a child who learns that fire is good because it gives off heat, but later learns that fire is bad if you get too close.
This is how the research paper presented it:
“This type of interactive learning requires that we either run systems with real users, or build simulations to evaluate algorithm performance.
The alternative, often used in reinforcement learning, is to build a simulation environment. Obviously this has the drawback that it is merely a simulation, but it also has significant advantages. It allows more rapid testing of algorithms than by relying on user participation. It also allows exploration of the parameters of user behavior. In particular, we can use a model to explore the robustness of a learning algorithm to noise in the training data.”
This is really cool. It shows how a search engine can use machine learning to understand user behavior and then train the algorithm without actual CTR data but with simulated CTR.
This means that a search engine can theoretically model user behavior on web pages even if those pages do not rank on the first page of the SERPs. This overcomes the limitations noted in the research way back in 2002.

4. User Intent and CTR – 2008

The last research paper I want to introduce is Learning Diverse Rankings with Multi-Armed Bandits (PDF). This research paper does not use the phrase user intent; it uses the phrase user satisfaction.
This paper focuses on the importance of showing results that satisfy the most users. Satisfying the most users means understanding which results generate the fewest clicks back to the search engine, also known as abandonment.
Satisfying all users means showing different kinds of web pages. User intent varies for many search queries, so what’s relevant for one user is less relevant to another. Thus, it’s important to show diverse search results, not the same kind of answer ten times.
Here’s what the paper says about showing multiple kinds of results:
“…user studies have shown that diversity at high ranks is often preferred. We present two online learning algorithms that directly learn a diverse ranking of documents based on users’ clicking behavior. We show that these algorithms minimize abandonment, or alternatively, maximize the probability that a relevant document is found in the top k positions of a ranking.”
And this is what the paper says about user satisfaction:
“…previous algorithms for learning to rank have considered the relevance of each document independently of other documents. In fact, recent work has shown that these measures do not necessarily correlate with user satisfaction…”
And here is the part that really nails the problem that search engines today have solved:
“…web queries often have different meanings for different users… suggesting that a ranking with diverse documents may be preferable.”
The only downside to this kind of CTR algorithm for determining user satisfaction is that it may not work well for topics where what users want is in a state of change.
“We expect such an algorithm to perform best when few documents are prone to radical shifts in popularity.”
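To give a flavor of the online learning these papers describe, here is a toy epsilon-greedy bandit that learns which document to show at a single rank position from simulated clicks. This is my simplified illustration, not the paper’s ranked-bandits algorithm, and the click probabilities are made up:

    import random

    def epsilon_greedy(docs, click_prob, rounds=10000, epsilon=0.1):
        # Track clicks (reward) and impressions per document.
        clicks = {d: 0 for d in docs}
        shows = {d: 0 for d in docs}
        for _ in range(rounds):
            if random.random() < epsilon:
                doc = random.choice(docs)  # explore a random document
            else:  # exploit the best observed click rate so far
                doc = max(docs, key=lambda d: clicks[d] / shows[d] if shows[d] else 0.0)
            shows[doc] += 1
            clicks[doc] += random.random() < click_prob[doc]  # simulated user click
        return max(docs, key=lambda d: clicks[d] / max(shows[d], 1))

    # The bandit converges on "b", the document users click most.
    print(epsilon_greedy(["a", "b", "c"], {"a": 0.2, "b": 0.5, "c": 0.3}))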

Read the CTR Research

And there you have it. These are, in my opinion, four important research papers to read before forming an opinion about the role of CTR in ranking web pages.
It’s important to note that the first research paper cited in this article is from 2002. The last one is from 2008. This gives an idea of how mature research into CTR is. Most research today is no longer focused on CTR. It is focused on artificial intelligence.
Nevertheless, if you are interested in CTR data and how it may play a role in ranking, you will benefit from reading these four research papers.

Reference: https://www.searchenginejournal.com/ctr-ranking/254631/

Friday, May 25, 2018

11 Key Parts of SEO You Need to Execute Correctly

The 11 Most Important Parts of SEO You Need to Get Right

SEO does not work in a vacuum. It requires many moving parts working together, in the right context and in the right order, to create a holistic, effective marketing strategy that gets results for your business.
From the audience, to the user, to final conversions, executing those moving parts correctly is what wins clients, conversions, and sales for your company.
In this post, I’ll detail the 11 most important parts of SEO you need to get right in order to facilitate an effective SEO process.

1. Your Audience & Industry

Your primary industry and its audience should be the number one consideration behind any viable SEO strategy.
  • What industry are you in?
  • Who are its top competitors?
  • Where do your competitors primarily do business?
  • How are your competitors primarily executing their SEO strategy?
  • Where is the competition fiercest?
These questions and more will determine your next steps in forming your SEO strategy, and the various moving parts will come into full focus as you proceed.

2. Keyword Research

As you nail down your audience and industry norms for SEO, keyword research is necessary to pinpoint the best possible user intent to go after and find what your audience is searching for.
But not only that: how your audience searches is just as important as what they search for. Subtle shifts in keyword research can make or break an SEO strategy.
You’d also better have a firm grasp of industry norms and market shifts, as well as buyer personas and how they impact the overall SEO strategy.

3. User Intent

User intent behind keywords is the next thing that is absolutely vital to the success of any SEO campaign.
For example, let’s say that your audience normally searches for “widgets that I want to put together” as a primary starting point.
But, throughout your keyword research, you find variations for “widgets for sale,” “DIY Widgets,” and “widgets that get things done”. Each of these variations results in at least a ten-fold increase in searches leading back to your landing page.
It would be a good idea to integrate these into the overall SEO process, now, wouldn’t it?
If you hadn’t done this keyword research and made adjustments based on market shifts in audience search behavior, you likely would not have found these deeper keywords that were worth targeting.
It all comes down to how deep you’re willing to go in keyword research. The deeper you go, the better the opportunities you may eventually uncover.

4. Analytics and Reporting

Let’s get real. Nothing is more important to an SEO campaign than accurate reporting.
If you can’t report on results that the campaign achieves accurately, then how can you expect to make the accurate adjustments that an SEO campaign requires?
Let’s also get real about something else. Some industries don’t require day-by-day or even week-by-week adjustments to keyword strategy. Most industries don’t even require adjustments every six months.
But, if you’re in a rapidly changing industry where the market shifts quickly, it may be important to integrate a quarterly or even bi-monthly keyword research task into your SEO process so that you know exactly what audiences are searching for next.
How does this fit into analytics reporting? When you attribute keywords and landing pages, it is easy to see exactly what keywords and landing pages are primary drivers of your SEO process execution and your overall SEO strategy.
By doing this effectively, it is possible to make the proper adjustments and eventually spot the next big shift in your market.
This is why it is so important to get analytics reporting correct. If the approximately 6,000 visits a month your analytics reports turn out to be bot traffic, have you really been successful at all?

5. Mobile SEO

The next big thing on everyone’s lips right now is Google’s mobile-first index. The mobile-first index is Google’s new de facto standard for search, with a focus on mobile websites.
It is important to note that this does not exclude desktop – desktop sites will still perform in search results if they are the best result for the query. But, Google’s move to mobile-first signifies the beginning of a new era – an era of dumbed-down search results for the masses.
A warning, and it is this author’s opinion: you may be shocked that I said that. Dumbed-down? Aren’t they supposed to be getting smarter? Well, unfortunately, mobile is now part of the lowest common denominator in search.
No longer will web designers have great canvases to create amazing website designs. Everything will pretty much lean towards one standard – iPhone or Android, and you’d better make sure that everything works fine on both, or else.
Aside from my rant on the evils of mobile SEO (I do apologize), while it is unfortunate that Google has chosen to go in this direction, forsaking all that is beautiful for a few measly increases in visits, this is an important aspect of SEO to get right.
As Google’s mobile index grows beyond the first several waves, we can expect to see algorithm shifts and updates just as we did with the normal index.
Now, m-dot (m.domain.com) mobile websites are not recommended and should go the way of the dodo. Aside from major issues with duplicate content, these types of mobile sites can also introduce canonical URL issues with indexation, and many other issues.
It is recommended by this author that all sites adopt mobile-friendly, responsive designs as a best practice moving forward.
The reasoning behind this is that it gives all versions of your site an equal opportunity to get indexed and remain competitive in the coming mobile-first index.

6. Crawling

Crawling is the process by which search engine spiders discover your site.
If your website architecture is out of whack, your internal linking is off, or you don’t even have a sitemap.xml file (shame on you), it will be difficult for search engines to crawl your site.
In addition, major issues with 404 errors on the site can hurt crawling and indexing as well.
Other issues include technical implementations that prevent spiders from crawling the site. One obvious solution for most SEO pros is making sure the following line is removed from robots.txt:
Disallow: /
This is entirely different from an empty “Disallow:” directive. While the difference is subtle, it can mean the difference between successful and unsuccessful crawling and indexing of your site.
The first directive tells all search engine crawlers not to crawl any page on your site.
The second allows all robots complete access.
Very different, right?
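You can verify the difference yourself with Python’s built-in robots.txt parser; example.com is a placeholder:

    from urllib import robotparser

    blocking = robotparser.RobotFileParser()
    blocking.parse(["User-agent: *", "Disallow: /"])

    permissive = robotparser.RobotFileParser()
    permissive.parse(["User-agent: *", "Disallow:"])

    url = "https://example.com/any-page"
    print(blocking.can_fetch("Googlebot", url))    # False: nothing may be crawled
    print(permissive.can_fetch("Googlebot", url))  # True: everything may be crawled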
This is why it is so important to make sure that your site is 100 percent functional, and crawlable from the start.
Don’t wait until an audit reveals that you missed a step during the setup of your new site. That will make you look silly.

7. Indexing

Indexing is completely different from crawling. While the two actions are related, one does not guarantee the other.
Misusing directives like noindex and nofollow will negatively impact indexing. So will doing something silly like not including a sitemap.
While you can get around not having a sitemap by using Fetch as Google in Google Search Console, it is simply much easier and more efficient to create a sitemap and submit it through Search Console.
One more massively bad situation includes canonicalizing your pages, but not identifying trailing slash issues. These types of issues can lead to indexing double or triple the amount of pages that your site actually has, which interferes with Google’s ranking algorithms.
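To see how trailing-slash (and similar) variants multiply your URL count, here is a minimal sketch of a normalizer that collapses them to one form; the URLs are placeholders:

    from urllib.parse import urlsplit, urlunsplit

    def normalize(url):
        # Lowercase the host, strip the trailing slash, drop fragments.
        scheme, netloc, path, query, fragment = urlsplit(url)
        if path != "/" and path.endswith("/"):
            path = path.rstrip("/")
        return urlunsplit((scheme, netloc.lower(), path, query, ""))

    variants = [
        "https://example.com/widgets",
        "https://example.com/widgets/",
        "https://EXAMPLE.com/widgets",
    ]
    print({normalize(u) for u in variants})  # one URL, not three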
When you don’t get indexing right, you can leave entire sections of your site out of Google’s index. This can lead to major ranking performance issues as a result.
This is why it’s so important to perform an in-depth website audit that takes into account these things. Because when doing so, you can uncover issues that may not have otherwise been taken into consideration. And these issues can make a big difference in your website’s performance.

8. Technical SEO

Site speed. Coding. JavaScript. Schema markup. Schema JSON-LD. Canonicalization.
Technical SEO almost always brings to mind these and other related terms.
When something on the technical side of SEO is out of whack, your entire website can suffer. Here are a few examples.
Say that you have created a website that has everything right, but you have left out a small detail in a canonicalization plug-in.
Or, you have created a large homepage slider that alone takes three seconds to download.
Or, you have created a Schema implementation where the business name is off by one letter in certain markup.
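For context, that kind of markup is usually emitted as JSON-LD. Here is a minimal LocalBusiness sketch in Python with placeholder values; generating the markup from a single source of truth is one way to avoid the one-letter-off mistake:

    import json

    business = {
        "@context": "https://schema.org",
        "@type": "LocalBusiness",
        "name": "Example Widgets Co.",   # one canonical spelling, reused everywhere
        "url": "https://example.com",
        "telephone": "+1-555-555-5555",
    }
    print('<script type="application/ld+json">' + json.dumps(business) + '</script>')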
This is where mistakes in an SEO implementation can get dicey, and why things are not always cut-and-dry when it comes to effective SEO projects.
As a site gets larger and infinitely more complex, one mistake in one part of the implementation really can affect another part of the SEO project.

9. Content

Content continues to be one of the de facto means by which SEO pros acquire links, increase rankings, and create streams of traffic to a website. It is another example where, if not executed properly, content can end up being the bane of a website’s (or author’s) existence.
Not all content is created equal. There’s spammy content, and non-spammy authoritative content. In general, if you manually write your content, you’re in the clear.
You don’t have to pay attention to weird things like reading levels, complexity in sentence structure, or other things to create content that ranks well. But, and this is a big but: ‘just create good content’ is a lie.
It is this author’s opinion that good content should be created in such a way that it goes viral, resonates with your audience, and creates such a lasting impact on people that they will want to buy from you.
Is it always possible to get content right from the beginning? No.
You can have all the process bits done properly. The content is executed well, it is written well, and the keyword research is solid and shows promise. But, for whatever reason, the content utterly, despicably flops. Hard.
There are things you can get right when it comes to content: keyword targeting, on-page optimizations, meta optimizations, no typos, no grammatical errors, optimized images. But, for whatever reason, the audience may still dislike the content.
In these cases, it isn’t always a cut-and-dry answer as to why the content flopped.
You can’t say that content was awful or bad when the content is written well. You can’t say that the content flopped because of arbitrary measures that are hard to quantify.
For whatever reason, the content flopped. And it’s not always you. Or your audience. Or the fact it wasn’t the right time.
It can be frustrating when this happens, because your next piece could knock it out of the park. The important thing is to keep getting the content right regardless. Then other things may fall into place.

10. Links

Again? You’re going to talk about links. Again!
Yes, I am.
Links continue to be a major ranking factor, whether you like it or not.
Google’s John Mueller has recently continued to reinforce the point that “a link is a link.”
Not exactly.
There are links that Google considers bad, according to its Webmaster Guidelines. Links that are spammy by nature are bad.
You should not acquire a link if it is going to leave a spammy footprint on your site’s link profile.
Links should come from non-spammy, high-quality authority sites in your niche. Again, the key word here is quality.
Getting a bad link every once in a while is not a huge deal. It becomes a huge deal when you do something like this over and over and saturate your link profile with bad links. That can come back to haunt you.
The key to a good link profile implementation is to ensure that you vary your profile a bit, that you don’t always go after one type of link, and that you keep a healthy ratio of types of links that you do go after.
I know, it’s not a prescription you can write out, where you can easily say “this link is black-and-white, de facto going to help your site get to number one on Google.” Not going to happen. But these are some general guidelines that should point you in the right direction toward best practices to follow when you build out your link profile.

11. The Most Important SEO Factor to Get Right: Taking Action

You can have the best-laid plans in the world. You can have the best, most awesome website idea out there. But without taking action to make that website a reality, you are just another loser in internet land who wishes they could get rich quick and live the American dream.
Once you take action, however, and you have that project – then get the rest of these SEO factors right.
Then you will be raking in that cold, hard cash. Maybe. I can’t make any guarantees on that.
But, enjoy the thrill of the chase anyway.


Reference: https://www.searchenginejournal.com/most-important-parts-of-seo/254225/?ver=254225X2

Thursday, May 24, 2018

10 Facts You Think You Know About SEO That Are Actually Myths

10 Facts You Think You Know About SEO That Are Actually Myths

SEO is renowned for misinformation, misunderstandings, and misconceptions. This is in no small part due to Google being somewhat of a black box, which it maintains in order to limit gaming of its SERPs.
However, in recent times, Google has taken noticeable steps to become more transparent through increased activity in the SEO community.
Whether it’s regular Webmaster Hangouts, speaking engagements at conferences, or insightful metaphors on Twitter, the likes of John Mueller, Gary Illyes, and Danny Sullivan at Google are helping to dispel the myths of SEO with facts.
To further banish these myths, I’ve put together a list of 10 common misunderstandings about Google and SEO and why they’re wrong.

Myth 1: Google Penalizes Duplicate Content

The duplicate content penalty doesn’t exist. Google doesn’t penalize sites for having duplicate content.
Google understands that duplicate content is a natural part of the web and aims to index the highest quality and most relevant page so that searchers aren’t repeatedly presented with the same content in the search results.
Unless a site is trying to manipulate rankings and is entirely made up of duplicate content, the worst case scenario resulting from duplicate content is that similar pages are folded together in the index and an alternative version of the page is shown instead.
SEO professionals can provide search engines with a number of signals as to which page they want indexed, including correct use of canonicals, sitemap inclusion, and internal links pointing to the preferred page.

Myth 2: Google Respects the Canonical URL as the Preferred Version for Indexing

Just because you set a URL as the preferred version for indexing via a canonical tag, it doesn’t mean that this page is the one that Google will select for indexing.
The rel=canonical is treated by Google as a signal for the preferred page and isn’t always respected.
Such instances can be found in the new version of Google Search Console in the Index Coverage report under the flag ‘Submitted URL not selected as canonical’.
Google may choose a page other than the one you have selected as the canonical when they judge another page in a set of duplicates to be a better candidate to show in search.
In such cases, it is advisable to consider whether the canonical page you have selected is really the one you want indexed. If it is, then you will need to look at the signals discussed previously (sitemaps, internal linking, etc.) to check that they are pointing to your preferred version.
The key is to ensure you’re sending Google consistent signals as to the preferred version of the page.
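One practical way to audit that consistency is to extract each page’s rel=canonical and compare it against the URL you submit elsewhere (sitemaps, internal links). A minimal sketch using Python’s standard library; the HTML snippet is a placeholder:

    from html.parser import HTMLParser

    class CanonicalExtractor(HTMLParser):
        # Records the href of the first <link rel="canonical"> tag seen.
        def __init__(self):
            super().__init__()
            self.canonical = None

        def handle_starttag(self, tag, attrs):
            a = dict(attrs)
            if tag == "link" and a.get("rel") == "canonical":
                self.canonical = a.get("href")

    page = '<head><link rel="canonical" href="https://example.com/page"></head>'
    parser = CanonicalExtractor()
    parser.feed(page)
    print(parser.canonical == "https://example.com/page")  # True: signals agree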

Myth 3: Quality Updates Result in Algorithmic Penalties

In a recent interview, former Google engineer Fili Wiese spoke about the myth of algorithmic penalties:
“One misconception that often exists is around things like Google Panda or Phantom as the industry called it, or Fred. Quality updates basically. But people think those are penalties or algorithmic penalties (when they’re not).
The thing is there is no such thing as an algorithmic penalty, it’s actually a recalculation. It’s like a big black box with the formula inside, you put something in, something comes out, and what comes out is the rankings and what goes in is your website.
The algorithm changes are just basically changes within the black box, which means that what comes out on the other side is slightly different now. Does that mean you’ve been penalized? No. It may feel like it, but you’re not penalized.”
This is a subtle difference that Wiese raises, but an important one in understanding how Google’s search algorithms operate.

Myth 4: Google Has 3 Top Ranking Factors

This was big news in March 2016 when Andrei Lipattsev announced that links, content, and RankBrain made up the top 3 Google ranking factors.
However, Mueller has since dismissed this statement in a Webmaster Hangout, saying that it isn’t possible to determine the most important ranking factors because this changes from query-to-query and from day-to-day.
It isn’t helpful to focus on individual ranking signals because search engine algorithms are too sophisticated for this to be a useful way of conceptualizing algorithms.
Instead, SEO pros should focus on optimizing their sites to improve user experience, match user intent and, more broadly, improve site quality while keeping up to date with Google’s latest developments.

Myth 5: Google’s Sandbox Applies a Filter When Indexing New Sites

A further misconception comes in the form of how Google treats new sites in the index. There is a long-held belief among some in the SEO community that Google applies a filter to new websites in order to stop spammy sites from ranking soon after launch.
Mueller put the Google sandbox to bed in a Webmaster Hangout, when he said that there was no such filter being applied to new sites.
He did, however, say that there may be a set of algorithms that might look similar to a sandbox but that they attempt to understand how the website fits in with others trying to rank for the same queries.
This, in some cases, may mean pages rank higher or lower for a period of time while Google’s algorithms work out how they fit in with competing pages.

Myth 6: Use the Disavow File to Maintain a Site’s Link Profile

One staple of an SEO’s responsibilities has historically been pruning a site’s backlink profile by disavowing low-quality or spammy links.
Over the years Google’s algorithms have gotten better at understanding these types of low-quality backlinks and knowing when they should be ignored. As a result, the need for SEO pros to maintain and update a disavow file has diminished significantly.
At BrightonSEO in September 2017, Illyes stated that if backlinks are coming in organically to a site, it’s extremely unlikely that the site will receive a manual action. Illyes went on to say that he doesn’t have a disavow file for his own personal site.
Now it is only recommended to make use of the disavow file when a site has received a manual action, in order to remove the offending links.

Myth 7: Google Values Backlinks from All High Authority Domains

Successful link building should be judged by the authority and relevance of the backlinks pointing to the target site. Still, backlinks from high authority domains are highly sought after regardless of how relevant they are to the target site.
However, another insight gleaned from Illyes’ BrightonSEO Q&A revealed that Google takes into account the context of backlinks, meaning that SEO pros should perhaps give more importance to link relevance when going after links.
Illyes thinks there is value in fixing internal and external links, but it is important to keep context in mind. If a poor quality article (which has nothing to do with your site) links to your site, Google will ignore it because the context doesn’t match.

Myth 8: Google Uses Page Speed as a Major Ranking Signal

Google has used site speed as a ranking signal since 2010 and intuitively you would think they have a clever way of incorporating it as a key part of their algorithms, especially as it has become such an important topic in SEO.
However, Mueller has explained that, while there are plans to introduce a speed update later in 2018, Google only uses speed to differentiate between slow pages and those in the normal range. In fact, DeepCrawl (disclosure: I work for DeepCrawl) found that Googlebot will crawl and index pages that take up to three minutes to respond.
Speed may also indirectly influence rankings through feedback from user experience, like visitors bouncing from a page that takes too long to load, but as it stands Google’s use of speed in their algorithms is rudimentary for the time being.

Myth 9: Fred Was an Algorithm Update Related to Link Quality

Google continuously updates its search algorithms, at an average rate of two to three updates per day. The Fred algorithm update in March 2017 was thought to be an update related to link quality.
However, Illyes has made it clear that there was no specific algorithm update like Panda or Penguin – in fact, he called the ranking fluctuations Fred as a joke.
Illyes went on to say that 95-98 percent of these ongoing updates are not actionable for webmasters. Fluctuations always happen but you should focus on having a high-quality site with lots of people talking about your brand via links, social networks, etc.

Myth 10: Crawl Budget Isn’t an Issue

Crawl budget is a complex and much-debated topic, but it is overlooked by some who overestimate Google’s ability to crawl all the pages on a given site.
For small to medium sites (up to around 200,000 pages), Google can generally crawl all of the pages. However, crawl budget is a pressing issue for those managing large enterprise sites, because they need to ensure important pages are being crawled on a regular basis.
One way to check if Google has crawled the vast majority of your site is by looking to see if Googlebot is crawling lots of 404 pages, as this indicates most of the important pages have already been crawled.

Conclusion

I hope this post has helped to kill off some of the long-held sacred cows of SEO. When it comes to SEO, there are facts and myths. Make sure you’re focusing on facts, not fairy tales!

Reference: https://www.searchenginejournal.com/seo-myths-facts/253609/

Wednesday, May 23, 2018

A Complete Guide to Facebook Page Optimization

How to Completely Optimize Your Facebook Page

Facebook is the most popular social media platform used by businesses.
Facebook Pages help your brand or business promote and share its value, and they assist with customer support.
Facebook remains the primary platform for most Americans. Two-thirds of U.S. adults now report that they are Facebook users and 74 percent of Facebook users say they visit the site daily.
Despite the recent criticism of Facebook’s data privacy practices, both daily and monthly users are up 13 percent year-over-year.
What does this mean? Facebook isn’t going anywhere anytime soon.
You should continue to make Facebook a part of your overall marketing mix – to reach your existing and future customers.
Features like location Pages, Messenger, Featured Images, and Boost are only a few of the many tools you can use to optimize your Facebook page.
Use this guide to make sure you’ve set up your Facebook page correctly, and optimized all possible areas of the platform to get the best results for your business.

Facebook Marketing Basics

Yes, Facebook is free, but your Facebook Page is by no means a substitute for your own website. A website is the only place online you can truly control your message.
Your Facebook Page serves as a micro-site within the platform that complements and perhaps highlights glimpses of your brand.
Capturing your brand name on Facebook and other social media platforms will definitely help get your name out there digitally, as social media profiles often rank at the top of the search engine result pages (SERPs).
When doing a search for “Sanitas Medical Centers Tampa,” I am presented with a bunch of results. Apart from the domain and search directories popping up on the first page, the Facebook location Page for their Tampa location shows up.
[Screenshot: Facebook location Page appearing in Google search results]
It’s also important for you to keep search engine optimization (SEO) methodologies in mind when developing and optimizing your Facebook page.
Important brand and non-brand long-tail keywords should also be sprinkled throughout your Facebook page, as well as your post updates.

Creating Your Facebook Page

When creating a Facebook page, it’s important to pick the right type of Facebook page right off the bat. You can choose from:
  • Local Business or Place: Only choose if you have one location. That said, don’t freak out if you have one location now, but will have more in the future. Keep on reading to find out more about Facebook location Pages.
  • Company
  • Brand or Product
  • Public Figure
  • Entertainment
  • Cause or Community
[Screenshot: Facebook page creation options]
Setting it up properly the first go-round will enhance the way you communicate the message you wish to show.
When you’ve picked the type of page you wish to create, simply visit Facebook.com/pages/create and begin the process.

Location Pages

If you have a business with one location, you can start off with a location page. However, if you expand your locations, there are various things you need to do in order to make that happen.
The good news: you no longer need to ask Facebook to enable location Pages. If you’re doing this for the first time and your main page has an address, you will encounter a “warning” message.
The reason? You will have multiple business locations.
Your main Facebook Page should be the main or “parent” page for your brand. The “child” pages are your location Pages.
[Screenshot: Facebook location Pages layout]
After you’ve gained access to your location pages tab, fill each location out with their proper:
  • Name
  • Address
  • Phone number
  • Username
  • Category
  • Website address
  • Email
  • About
As stated before, make sure to use keywords you’re trying to rank for in search. If you’re an urgent care facility, use words like “urgent care” or “medical center” throughout your copy.
Another great benefit of location pages is that you can implement ratings and reviews. You have the option to hide these, but it’s best to show them, because reviews play a huge part in digital marketing, local marketing, and SEO.
Just make sure you have a sufficient process in place to triage and respond to reviews. If you’re ready to implement, head to Facebook and follow these steps.

Keep Business Operating Hours Accurate

It’s essential to enter your business hours, but it’s equally important to update them whenever they change.
When adding new location pages, make sure the hours of operation and days open are correct for each location, as they may vary.
Unlike Google My Business, Facebook does not let you customize hours for holidays or other special events. If you have custom hours for holidays and special events, then take advantage of utilizing Facebook posts or advertising to convey this message.
For example, if you’re experiencing inclement weather, hosting a special event, or promoting a new product you now sell, create visually appealing posts and pin them to the top of your Facebook page so they’re seen by those visiting your page.

Custom Username

Having a custom username (a short, user-friendly URL) for your page makes it more convenient for users to find your Facebook Page in search.
When you start out, your Facebook URL will have various numbers after it and look something like:
http://www.facebook.com/pages/your-brand-name/857469375913?ref=ts
Not very friendly or memorable.
You should keep your brand name top of mind. If you’re optimizing a Facebook location page, I highly recommend using the brand name plus the location in the username.
Facebook Pages with usernames are also allowed to create custom URLs that enable people to quickly visit and message them.
As you can see in the example below, if you search for “@LLBeanLynnhavenStore” you will be able to message or visit that custom location page.
[Screenshot: custom Facebook Page URL in search]
Some other things to keep in mind when creating a username:
  • You need to be an admin to create a username.
  • You can’t use any spaces or underscores but you can have periods separating words.
  • Capitalizing words enhances readability and won’t prevent people from finding you if they type in lowercase letters.
  • Usernames must be at least 5 characters long.
  • Usernames can be a maximum of 50 characters.
Want a custom username for your Facebook page? Follow these steps.
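For reference, here is a quick validator for the username rules listed above. This is a hypothetical helper for sanity-checking candidate names, not a Facebook API:

    import re

    def valid_username(name):
        # 5-50 characters; letters, digits, and periods only
        # (no spaces or underscores).
        if not (5 <= len(name) <= 50):
            return False
        return re.fullmatch(r"[A-Za-z0-9.]+", name) is not None

    print(valid_username("LLBeanLynnhavenStore"))  # True
    print(valid_username("my_brand"))              # False: underscore not allowed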

Profile & Featured Image

Facebook Pages give you a great opportunity to reflect your brand.
One way of customizing your Facebook page is to use not only the profile/avatar feature but also to take full advantage of the featured image section.
You can now do more than just upload a photo to the featured image/banner section: you can get creative and utilize a video or create a slideshow.
[Screenshot: Facebook cover photo example]
This is a great place to promote testimonials or your team that serves your customers.
According to Facebook, “Cover photos can’t be deceptive, misleading or infringe on anyone’s copyright.”
Read through Facebook’s guidelines to get a better idea of what you should abide by.

Call to Action Button

Right under the cover photo, you can also add a call to action (CTA) to encourage your users to interact either on the page or help learn more about your business.
What your brand does will determine the best CTA.
For example, if you’re an urgent care facility with facility leaders available to speak with your patients, add the “Call Now” feature rather than “Sign Up.”
  • Go to your Facebook Page.
  • To the bottom right of your featured post, you will see the CTA button in blue.
  • Click on that button and then you will be able to pick which CTA you prefer.
[Screenshot: call to action button on a Facebook page]

Managing Customer Reviews & Comments

Engaging and interacting with your customers is an integral part of social media.
Facebook is a great platform where you can provide great service (responding and assisting customers) and also discover new ways to improve your business.
You need a strategy around triaging and managing comments and reviews, so don’t take this section lightly.
If you are prepared to respond not only to comments on posts and ads but also to reviews, you can implement reviews on your page. To do this, simply:
  • Go to your Facebook page.
  • Click on Settings.
  • Under General > Reviews.
  • You can then select Allow visitors to review this page or, if you’re not quite ready for this, simply click on Disable Reviews.
Reviews are a great way to show off how well your business is doing.
If you get a less-than-great review, be sure to respond. This shows consumers that your brand is engaged and cares about making them happy.

Messenger

Facebook Messenger, like reviews, is another great way to show how willing your business is to provide great service and support across various mediums. Messenger is just another way your consumers can connect with you.
Again, you need to know your bandwidth. If you’re willing to implement this step, it requires a strategy. You must consider how quickly you can respond to your messages.
Your responsiveness rate will appear on your page. It shows how efficient you are at responding to customer inquiries.
If you’re ready to implement Messenger:
  1. Go to your Facebook page.
  2. Click on Settings.
  3. Under General > Messages.
  4. Then click the button that says Allow people to contact my Page privately by showing the Message button.

Organizing Your Page Tabs

While this is kind of obvious, if you end up implementing Facebook location pages, this step has to be repeated for each one. Not only can you arrange these tabs, but you can even turn some of them on and off.
The reason why you want to take a look at this section is that some tabs may be more of a priority for your business, depending on what you do.
You can also utilize the templates that Facebook provides, which can also take the guesswork out of how to organize your tabs.
  • Go to your Facebook page.
  • Click on Settings.
  • Then, on the left, click on Edit Page.
  • Once in the Edit Page area, you will be able to see the various templates, and also place various tabs in order by dragging the three-lined icon to the left of the tabs and moving them around.

[Screenshot: arranging Facebook page tabs]

Claiming Unofficial Pages

Have you ever searched for your brand’s name and found other pages with the brand name (along with a map, reviews, and ratings) – yet you don’t have access or control of the page?
Well, this is an annoying, yet solvable issue.
Facebook creates these pages to act as placeholders, giving visitors an area to check in and leave reviews and comments about the location.
Unfortunately, sometimes Facebook is a little too eager to do this as some businesses already exist for that specific location.
So what do you do?
The best way to solve this issue is to claim and merge (if needed) these unofficial Facebook pages. Doing so can give you complete control over your brand on Facebook.
Note: Just because you implement the following steps once, does not mean you’re done. Checking for unofficial pages should be a regular part of your overall social media maintenance.
If one of these pages appears when you search for your brand name, you will need to claim it before you merge it with your verified Facebook page.
Here are some ways you can claim the unofficial page:
  • Verify via phone call
  • Email
  • Utility bill/Phone bill
  • Business license
  • Business tax file
  • Certificate of formation
  • Articles of incorporation
The fastest way to gain access is with a utility bill, especially if you work for an agency and you aren’t physically in the place of business.
Once you verify the page (which can take up to 24 hours), you’re ready to move on to the next steps:
  • Go to your unofficial duplicate page on Facebook that you just claimed.
  • Select Is this your business? from the drop-down menu.
  • Choose the option Merge into a verified page you manage.
  • Select your page from the drop-down and submit.
Note: If you have multiple location pages, make sure you merge with the correct page.
Even if you don’t have location pages, you can still use the above process to merge an unofficial Facebook page with the one you are managing.

Reference: https://www.searchenginejournal.com/optimize-facebook-page/253335/?ver=253335X2

Friday, May 18, 2018

6 Easy Ways to Attract More Website Traffic

Optimizing for mobile and customized email marketing are just two of them.

6 Easy Ways to Attract More Website Traffic

Traffic is the lifeblood of any online business. And success is difficult to achieve without it. No matter how much time, effort and money you've put into building your website, if you're not getting traffic, the value of your site drops because of all those potential customers who never see it. And that's just bad for business.
Related: 3 Super Simple Pinterest Strategies to Quickly Grow Your Website's Traffic
So, given that driving more traffic to your site will increase your online business's odds of success, you have to figure out: How do you do it?
Based on my time building and growing numerous websites, here are six tried and tested techniques I've found work in driving traffic to your website.

1. Recognize that content is king.

You may not see the results overnight, but a robust content marketing strategy is one of the best ways to increase traffic to your website in the long term.
In the past, this may have meant stuffing your page with keywords in an effort to artificially boost your search engine result page (SERP) ranking. But Google now explicitly advises against this. While it's still important to create SEO-friendly content (Wordstream has a helpful guide on how to do this), Google's increasingly sophisticated search algorithms do a better job all the time of "sniffing out" quality.
Shortcuts, like keyword stuffing to outsmart Google's algorithm and increase a page's ranking, have not only become ineffective, but Kissmetrics warns that they may actually lead to your site being penalized by Google. Additionally, quality content is far more likely to be shared, resulting in more backlinks to your website. Backlinks not only drive more organic traffic, they also improve SERP rankings.
According to SearchEngineWatch, results on the first page of Google receive 92 percent of all traffic. Organic traffic tapers off precipitously from there. Improving your organic search results by creating quality content is one of the best ways to drive more traffic to your site.

2. Get social.

Being active on social media is one of the best ways to stay engaged with your audience and drive traffic back to your website. Hosting giant GoDaddy found that 61 percent of its high-traffic sites had an attached Facebook page. While having a Facebook page and a Twitter account is more or less considered a requirement for online businesses today, don't neglect the less-established platforms.
Let's say, for instance, that your business is primarily B2B. In that case, LinkedIn can be a gold mine for leads. Does your business sell products with a strong visual identity? Instagram lets your pictures tell a thousand words. Digiday notes that organic reach on Facebook is becoming ever harder to achieve, so expanding your social media footprint is one of the best and most cost-effective ways to reach your customers.

3. Optimize for mobile.

In May 2015, Google announced that the volume of searches on mobile devices had surpassed those on desktops for the first time. This trend has continued, and with mobile devices getting faster and more sophisticated, there's no reason to think it will abate any time soon. Not surprisingly, Google now factors how mobile-friendly a website is into its SERP rankings. It even offers a free tool that can tell you how mobile-friendly your website is.
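If you want to check pages in bulk, the same Mobile-Friendly Test is also exposed through Google's Search Console API. Here's a minimal sketch in Python – it assumes you've created an API key in the Google Cloud console, and the endpoint and response field reflect the API as documented at the time of writing, so verify them against the current docs:

  import requests

  # Google's Mobile-Friendly Test API (part of the Search Console API).
  ENDPOINT = ("https://searchconsole.googleapis.com/v1/"
              "urlTestingTools/mobileFriendlyTest:run")

  def check_mobile_friendly(url, api_key):
      """Ask Google for its mobile-friendliness verdict on a single URL."""
      resp = requests.post(ENDPOINT, params={"key": api_key}, json={"url": url})
      resp.raise_for_status()
      # Typical verdicts: MOBILE_FRIENDLY or NOT_MOBILE_FRIENDLY.
      return resp.json().get("mobileFriendliness", "UNKNOWN")

  print(check_mobile_friendly("https://www.example.com/", "YOUR_API_KEY"))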
In addition to the effect mobile-friendliness has on your website's SERP ranking, it can also influence consumer trust in your business and the likelihood that people will recommend it. Google found that 89 percent of people are likely to recommend a brand after having a positive brand experience on mobile. Even in this digital age, word of mouth is a powerful tool for driving traffic to your website.

4. Optimize for speed.

Another factor that not only affects SERP ranking but greatly impacts usability is page speed. Nobody likes to sit around waiting for a page to load. According to Kissmetrics, 40 percent of people abandon a website that takes more than three seconds to load. One of the most common culprits when it comes to slow page-load times is image size.
Free tools such as ImageOptim make it easy to compress your images before you publish them on your website. Depending on what platform your website is built on, plug-ins like Smush for WordPress can optimize all your images retroactively. If your website is image-heavy, this can substantially improve its performance.
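If you'd rather script the compression step yourself, a few lines of Python with the Pillow library will batch-process a folder of images before upload. This is a minimal sketch – the folder names are placeholders, and it assumes JPEG output at quality 70 is acceptable for your site:

  from pathlib import Path

  from PIL import Image  # pip install Pillow

  MAX_WIDTH = 1200  # resize anything wider than this
  QUALITY = 70      # JPEG quality: lower = smaller files

  def compress_images(src_dir, dest_dir):
      """Resize and recompress every JPEG/PNG in src_dir into dest_dir."""
      dest = Path(dest_dir)
      dest.mkdir(parents=True, exist_ok=True)
      for path in Path(src_dir).glob("*"):
          if path.suffix.lower() not in (".jpg", ".jpeg", ".png"):
              continue
          img = Image.open(path).convert("RGB")
          if img.width > MAX_WIDTH:
              img = img.resize((MAX_WIDTH, int(img.height * MAX_WIDTH / img.width)))
          img.save(dest / (path.stem + ".jpg"), "JPEG", quality=QUALITY, optimize=True)

  compress_images("images/raw", "images/optimized")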
Once again, Google offers a free tool that gives you insight into how the speed of your website measures up.
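That speed tool (PageSpeed Insights) also has a public API, which makes it easy to track scores over time. The sketch below uses the v5 endpoint and the Lighthouse performance score field as documented at the time of writing – treat both as assumptions to verify before relying on them:

  import requests

  PSI_ENDPOINT = "https://www.googleapis.com/pagespeedonline/v5/runPagespeed"

  def pagespeed_score(url, strategy="mobile"):
      """Fetch the Lighthouse performance score (0-100) for a URL."""
      resp = requests.get(PSI_ENDPOINT, params={"url": url, "strategy": strategy})
      resp.raise_for_status()
      data = resp.json()
      # Lighthouse reports the category score on a 0-1 scale.
      return data["lighthouseResult"]["categories"]["performance"]["score"] * 100

  print(pagespeed_score("https://www.example.com/"))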

5. Email marketing.

Virtually as long as there's been email, there's been email marketing. It's become so ubiquitous that on occasion observers have predicted its demise. While cold-emailing may be on life support due to the efficiency of spam filters and regulations like GDPR, marketing to a list of engaged subscribers remains one of the most efficient means of driving traffic to your website.
What better way to communicate about new products, services or content than by sending timely, relevant and personalized emails to your subscribers?
If you're in the business of ecommerce, automated email marketing tools like MageMail can help significantly boost your sales. These solutions allow you to retarget customers who have browsed your site or added items to a cart without completing a purchase. Abandoned-cart emails have an astonishing average open rate of 40 percent if sent within three hours of abandonment, according to Business Insider.
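To make that three-hour window concrete, here's a minimal sketch of the scheduling logic behind an abandoned-cart reminder. The cart records are hypothetical stand-ins for whatever your ecommerce platform exposes, and the print call is where your email service would plug in:

  from datetime import datetime, timedelta

  REMINDER_WINDOW = timedelta(hours=3)  # send within 3 hours of abandonment

  def carts_due_for_reminder(carts, now=None):
      """Yield carts abandoned within the last 3 hours that haven't been emailed."""
      now = now or datetime.utcnow()
      for cart in carts:
          age = now - cart["abandoned_at"]
          if timedelta(0) < age <= REMINDER_WINDOW and not cart["reminded"]:
              yield cart

  # Hypothetical cart data; in practice this comes from your store's database.
  carts = [
      {"email": "a@example.com",
       "abandoned_at": datetime.utcnow() - timedelta(hours=1), "reminded": False},
      {"email": "b@example.com",
       "abandoned_at": datetime.utcnow() - timedelta(hours=5), "reminded": False},
  ]
  for cart in carts_due_for_reminder(carts):
      print(f"Send reminder to {cart['email']}")  # swap in your email service call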

6. Pay-per-click and social media advertising.

While organic search may provide better ROI for your business in the long term, paid search can potentially deliver results more quickly. A well-thought-out and executed pay-per-click (PPC) campaign through Google AdWords can lead to dramatically increased traffic.
Be sure to do your research, though. PPC campaigns can quickly become expensive if insufficiently planned and targeted. Keyword Planner, from Google once again, is an invaluable tool, but don't stop there. Ahrefs can help pinpoint exactly what your competitors are doing with their paid search campaigns. SEMrush can show you competitor budgets, best keywords and their most profitable ad copy. Armed with this knowledge, you can adjust and improve your PPC campaigns accordingly.
I've already discussed the importance of having a robust social media strategy to drive more traffic to your website. Increasingly, though, creating and sharing quality content on your social media channels is no longer sufficient in itself.
This is particularly true of Facebook, where recent changes to the newsfeed, dubbed "Facebook Zero," have made it even harder to reach followers organically. Enter Facebook ads. Utilizing its vast stores of customer data, Facebook allows you to really drill down on your target audience, serving ads only to the demographic you define.

Final thoughts

It's hard to deny that it's never been easier to launch an online business. One consequence is that it's never been harder to stand out from the crowd. Put the six tried-and-true traffic-building strategies outlined above to work for your online business today. The search-engine results will speak for themselves.

Reference:https://www.entrepreneur.com/article/311774

Thursday, May 17, 2018

9 Essential Tips to Make Better PPC Reports

9 Essentials to Make Perfect PPC Reports for Your Clients

PPC reporting can get complicated and cumbersome in a hurry. The gap between what a client wants to know and what a PPC analyst wants to provide is often wide and hard to span.
The weekly, monthly, and quarterly report cycles can become routine, automated processes that get stale over time.
Even when we know or feel that we need to improve our reporting systems, templates, and methods, it can be hard to take that step.
Successful client relationships are built on trust, and trust is often built on how success is defined, measured and communicated.
A lot goes into a strong client relationship. Much of the communication happens through reporting.
Being a good (and truthful) storyteller goes a long way toward effectively communicating and ensuring that you and the client are on the same page.
Search marketing is now a mature industry, so it is hard to get away with confusing metrics and acronyms – we can typically tie PPC activities directly to conversions and business goals.

1. Be Transparent & Consistent

Numbers don’t lie, but they can be used to tell a different story than what is actually happening.
Having an honest and open approach to sharing performance – whether or not it is going well – is the first essential for the perfect report.
When you share the same stats and details each reporting period, you build trust and can have an honest conversation about what is working, what isn’t, and how the strategy is evolving.
When you have a consistent reporting format, it becomes familiar and easy to follow for your client.
Plus, when you report on performance on a consistent cadence – whether weekly, monthly, or quarterly – the client knows what to expect and when to expect it.
This makes the report more powerful: you can use it as a tool, rather than scrambling to pull details together whenever the client asks, or sending over smaller updates that lack context and remove your control of the process.

2. Organize the Report from General to Specific

Keep high-level content up front and ease into the fine details. Putting summary info and high-level stats ahead of ad groups, ads, and keywords will keep your client engaged.
You likely have some clients who don’t go beyond the first page or two of your report while others want to consume every detail.
By starting with general, high-level content and working down into detailed stats and campaign components, you’ll keep all types of clients engaged.

3. Start with Goals

While we often want to start with our core PPC metrics (e.g., impressions, clicks, conversions, click-through rate), our clients often prioritize ROI and ROAS.
Know what your client cares about the most and how they are reporting to their stakeholders. Make that the first thing you report on.
PPC stats matter, but not as much as the client’s own definition – in the bigger business context – of whether our efforts are working.

4. Have a Dashboard or Executive Summary

Always assume that your report is going to be forwarded or passed along to people who don’t normally meet with you.
If you have never met with the C-suite, run your report through a simple filter: could someone you have never met, who doesn’t know PPC metrics, understand what you are doing and whether it is working? That is best done in an executive summary or dashboard at the beginning of the report.
If your report is already tailored to start with client goals (versus PPC metrics) and high-level content, this summary will fit naturally into the first page or two.
Use all the information at your disposal to answer whether you’re hitting client goals, what the strategy is, any highlights from the previous reporting period, and what you’re doing during the next one.
If you write this in a way that someone who has no idea what PPC is and has never met you can understand it in basic terms, then you have succeeded.

5. Provide Definitions

Remember that your client may not necessarily have every acronym memorized or remember how each stat is calculated.
Include a key under each table or figure, or a standard definitions section the client can reference – helpful for those who need it, but not insulting or in the way for those who are well-versed.
By consistently including definitions you can make the stats and subject matter more approachable for your client and over time you can get deeper into specifics.

6. Segment Performance Data by Intent

Not all keywords are intended to directly drive a conversion. Don’t forget about attribution modeling and the customer journey in your report.
If you focus on conversions, but not all campaigns, ad groups, or keywords are focused on last-click attributed conversions, then be sure to segment your report accordingly.
You can inadvertently make your performance look bad if you’re highlighting conversions throughout the report as a goal, but not all campaign activities are actually expected to drive a conversion directly.
Consider reporting by stage in the customer journey, separating brand from generic terms, and including assisted conversions, revenue data, geographic targeting, and other breakdowns that allow fair judgment of each segment based on its intent or expectation.
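As a minimal sketch of the brand-versus-generic split, the snippet below tags each keyword row by intent before aggregating. The brand terms and column names are hypothetical – adapt them to your own account export:

  import pandas as pd

  BRAND_TERMS = ("acme", "acme shoes")  # hypothetical brand terms

  def segment_by_intent(df):
      """Tag each keyword row as 'brand' or 'generic', then aggregate."""
      df = df.copy()
      df["segment"] = df["keyword"].str.lower().apply(
          lambda kw: "brand" if any(t in kw for t in BRAND_TERMS) else "generic"
      )
      return df.groupby("segment")[["clicks", "cost", "conversions"]].sum()

  # Hypothetical keyword export.
  report = pd.DataFrame({
      "keyword": ["acme shoes", "buy running shoes", "acme coupon", "trail shoes"],
      "clicks": [120, 340, 80, 210],
      "cost": [60.0, 510.0, 32.0, 280.0],
      "conversions": [30, 25, 12, 14],
  })
  print(segment_by_intent(report))

Brand terms typically convert at a much higher rate, which is exactly why blending them with generic terms can make the generic work look worse than it is.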

7. Aggregate Everything Possible

Google AdWords does a great job of rolling up stats for you.
However, if you’re running ads in Bing and/or additional advertising platforms, it can be helpful to aggregate stats to show what PPC is doing across all networks. This can be done using reporting software or your own manual methods.
One area that often gets lost – because Google and Bing don’t see it – is any third-party call tracking data you have. You don’t want phone conversions left out of your PPC reporting.
By aggregating data in your report, you can paint the bigger picture at a high level and spare your client from having to do the math and compile their own stats from the report you provide.
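A manual version of that roll-up can be as simple as concatenating each network’s export and summing. In this sketch, the file names and the shared column layout (network, clicks, cost, conversions) are hypothetical:

  import pandas as pd

  # Hypothetical per-network exports sharing one column layout.
  frames = [
      pd.read_csv("adwords_export.csv"),
      pd.read_csv("bing_export.csv"),
      pd.read_csv("call_tracking_export.csv"),  # phone conversions count too
  ]
  combined = pd.concat(frames, ignore_index=True)

  # One roll-up across all networks, so the client never has to do the math.
  totals = combined.groupby("network")[["clicks", "cost", "conversions"]].sum()
  totals.loc["ALL NETWORKS"] = totals.sum()
  totals["cost_per_conversion"] = (totals["cost"] / totals["conversions"]).round(2)
  print(totals)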

8. Get Detailed (With Permission)

Often we use our reports in a meeting or conversation about our strategy and efforts.
My team uses the report as the agenda for monthly calls and it is a great time to go over details and get feedback. Some clients love to see every keyword while others just want the executive summary.
Even for those that only want the executive summary, we often need their feedback on items that are more granular like ad copy or keyword targeting. When that’s the case, it is necessary to include what might be dozens of pages of additional granular detail to reference.
When you want to or need to include granular detail, do it at the end in an appendix or in a separate supporting document.
Don’t put pages of keywords into the middle of a report, as that damages the general-to-specific flow and blunts the overall story of strategy and performance you’re trying to tell.

9. Integrate Data Beyond the PPC Conversion

In many client relationships, the PPC conversion is a lead, engagement, or something short of a completed sale producing trackable revenue.
When that’s the case, if you can work with the sales team, get CRM access, or find a way to get feedback on the leads PPC is driving or the activity it is generating, you can do more to show the true impact of your efforts.
At the very least, you can get valuable feedback that will influence your decisions while managing the campaign in real time versus waiting for anecdotal feedback from the client after the fact.
Getting this data as quickly as possible helps avoid a situation where you’re driving what looks like good PPC leads, but learn later that none of the leads qualified or closed.
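Even a flat CSV export from the CRM is enough to close that loop: join it against your PPC lead data and you can see close rates per campaign. All file names and columns below are hypothetical:

  import pandas as pd

  # Hypothetical exports: PPC leads keyed by lead_id with their campaign,
  # and CRM records keyed by lead_id with a final status.
  leads = pd.read_csv("ppc_leads.csv")    # columns: lead_id, campaign
  crm = pd.read_csv("crm_outcomes.csv")   # columns: lead_id, status

  merged = leads.merge(crm, on="lead_id", how="left")
  merged["closed"] = merged["status"].eq("closed")

  # Close rate per campaign exposes campaigns whose leads never qualify.
  summary = merged.groupby("campaign").agg(
      leads=("lead_id", "count"),
      closed=("closed", "sum"),
  )
  summary["close_rate"] = (summary["closed"] / summary["leads"]).round(3)
  print(summary.sort_values("close_rate"))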

Conclusion

Including these nine essentials in your PPC reports helps make the reporting process meaningful – and much more than just a routine activity.


Reference:https://www.searchenginejournal.com/ppc-report/253021/?ver=253021X2

Wednesday, May 16, 2018

Why You Should Think of New Ways to Approach SEO


New Ways to Approach Technical SEO: A Necessity, Not an Option

The practice of SEO has changed more than any other marketing channel over the last decade.
Through a succession of algorithmic evolutions, SEO has also remained the foundation of a successful digital strategy – 51 percent of online traffic arrives at websites via organic search, after all.
SEO has gone mainstream.
Still, we must take stock of the fact that SEO in 2018 requires new skills and approaches to succeed in an increasingly competitive world.
With more than 5,000 devices integrated with Google Assistant and voice search on the rise, the focal points of search have become decentralized.
The SERP as we knew it is long gone; search is dynamic, visual, and everywhere now.
This has a very significant impact on organizations, as SEO is a collaborative discipline that requires a synthesis of multiple specialisms to achieve optimal results. At the heart of this lies the domain of technical SEO, which has remained the foundation upon which any successful strategy is built.

A Brief History of Technical SEO

All roads lead back to technical – it’s how you now use your skills that has changed.
[Figure: Technical SEO history, from the early 2000s to 2018]
SEO has always entailed driving high-quality traffic through organic search.
The means of achieving this goal have altered significantly since the early days of SEO, when technical skills were dominant.
Crawlability was then – as it is now – a foremost consideration when setting up an SEO strategy.
Content was secondary – a vehicle to include keywords and improve rankings. This evolved over time to encompass link building, based on Google’s key early innovation of using links to evaluate and rank content.
The goal of marketers remained constant: to attract organic search traffic that converted on their website.
As a result, we endured a cat-and-mouse game, with some marketers doing whatever it took to gain high search rankings.
As soon as Google caught up with keyword cloaking, black hat SEO practitioners moved on to link buying in an attempt to manipulate their rankings. The Panda and Penguin algorithm updates put paid to a lot of those murky tactics and even (briefly) raised the discussion of whether SEO was dead.
This question missed one key point.
As long as people are using search as a means to discover information, SEO will continue in rude health. Those discussions are a distant memory as we embrace modern SEO, especially its convergence with content marketing.
The industry has gone from strength to strength and the best strategies are now justly rewarded with increased search presence.
In the process, SEO has moved from an entirely rational discipline to something more rounded, including the typically “right-brained” domain of creative content. This has changed the shape of SEO departments and demanded collaboration with other digital marketing departments.
Technical SEO, for its part, now encompasses all search engine best practices and allows no room for manipulation. This specialism never went away, but it has seen a recent renaissance as senior marketers realize that it drives performance as well as crawler compliance.
There are four key areas to this:
  • Site Content: Ensuring that content can be crawled and indexed by all major search engines – in particular, making use of log file analysis to interpret their access patterns (see the sketch after this list) and structured data to enable efficient access to content elements.
  • Structure: Creating a site hierarchy and URL structure that allow both search engines and users to navigate to the most relevant content. This should also facilitate the flow of internal link equity through the site.
  • Conversion: Identifying and resolving any blockages that prevent users from navigating through the site.
  • Performance: A key development has been the evolution of technical SEO into a performance-related specialism. This has always been the case, but marketers of all stripes have realized that technical SEO is about a lot more than just “housekeeping.” Getting the three areas above in order will lead to better site performance through search and other channels, too.
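As a sketch of the log file analysis mentioned under Site Content, the snippet below counts Googlebot requests per URL from a standard combined-format access log. The log path is a placeholder, and a production version should verify Googlebot via reverse DNS rather than trusting the user-agent string:

  import re
  from collections import Counter

  # Combined log format: ip - - [time] "METHOD /path HTTP/x.x" status size "ref" "UA"
  LINE_RE = re.compile(r'"(?:GET|POST) (\S+) HTTP/[\d.]+" \d{3} .*"([^"]*)"$')

  def googlebot_hits(log_path):
      """Count requests per URL path made by a Googlebot user agent."""
      hits = Counter()
      with open(log_path) as f:
          for line in f:
              m = LINE_RE.search(line)
              if m and "Googlebot" in m.group(2):
                  hits[m.group(1)] += 1
      return hits

  # Hypothetical log location.
  for path, count in googlebot_hits("/var/log/nginx/access.log").most_common(20):
      print(f"{count:6d}  {path}")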
Within this context, it is worth questioning whether “SEO” is even an adequate categorization for what we do anymore.

A New Approach: Site, Search & Content Optimization

The term “search engine optimization” is arguably no longer fit for purpose, as we extend our remit to include content marketing, conversion rate optimization, and user experience.
Our work includes:
  • Optimizing the site for users.
  • Ensuring accessibility of content for all major search engines and social networks.
  • Creating content that engages the right audience across multiple marketing channels.
According to research from BrightEdge, only 3 percent of 250 marketers surveyed believe SEO and content are separate disciplines.

We should therefore be looking at this set of skills as site, search, and content optimization – especially as the role of a search engine continues to evolve beyond the 10 blue links of old.
Our responsibility is to get in front of consumers wherever they are searching, which is an ever-changing set of contexts. This would be a more fitting depiction of a marketing channel that plays an increasingly pivotal role in digital and business strategy.
After all, when major technological trends develop, technical SEO pros are often at the forefront of innovation. This looks set to be further entrenched by recent industry developments.
Now that Accelerated Mobile Pages (AMP) and Progressive Web Apps (PWAs) are center stage, brands must ensure that their web presence meets the highest standards to keep pace with the modern consumer.
Being “mobile-first” has big implications for how we engage our audiences, but it is also a technological consideration. PWAs will soon be coming to Google Chrome on desktop, which is a further manifestation of the “mobile-first” approach to site experiences that we all need to adopt.
It would be hard to argue that these fit uniquely under the remit of ‘Search Engine Optimization’, and yet it is likely SEO pros who will lead this change within their respective organizations.
Brands need to think beyond search engines and imagine the new ways their content could – and should – be discovered by customers.
A different approach to SEO is required if we are to tap into the full potential of emerging consumer trends. That approach should expand to include site experience optimization, as well as traditional SEO techniques.
There are plentiful new opportunities for those who adapt; a process that can be accelerated by creating a collaborative working environment.

6 Thinking Hats & SEO

However we choose to label it, it should be clear that SEO has never existed in a vacuum. From its early symbiosis with web development to its latter-day convergence with content, SEO has always been about collaboration.
It is therefore helpful to consider frameworks that can bring this idea to life and bring together the specialist skills required for a modern organic search campaign.
We typically talk only about black hat and white hat in SEO (with the occasional mention of gray), but Edward de Bono’s Six Thinking Hats approach can add structure to collaboration.
Each hat reflects a way of thinking and separates out the different functions required to achieve successful outcomes. These could be entirely different departments, different individuals, or even different mindsets for one person.
The objective is to improve the collaborative process, but also to erode the fallibility of subjectivity by approaching every challenge from all angles before progressing.

1. White Hat

A well-known term for most SEO pros, White Hat thinking in this context depends purely on facts, statistics, and data points. This is the most objective way of approaching a situation.

Who Should Wear This Hat?

Data analysts and analytics specialists are typically naturals at adopting this approach.

Why Is It Needed for SEO?

Looking purely at the data is a perfect starting point for discussion. It keeps everyone focused on the objective truths of cross-channel performance. Data without context is meaningless, of course, so this approach in isolation lacks the color needed to understand consumers.

2. Yellow Hat

The Yellow Hat approach brings optimism to the table, focusing on the potential benefits a strategy may bring for brands and the consumer.

Who Should Wear This Hat?

Anyone can be an optimist, so this could be a mindset that all parties take on for a period of time. Equally, this could be handed to one person as a responsibility; the key thing is to maintain some structure.

Why Is It Needed for SEO?

We tend to have a lot of ideas, so it is easy to jettison some of them before their full potential has been explored. Taking an alternative view allows for full exploration of an idea, even if only to retain some of its components.

3. Black Hat

The Black Hat is anathema to advanced SEO pros, but the concept does have value in this particular context. We can use this interchangeably with the “devil’s advocate” approach, where someone purposefully points out obstacles and dangers for the project.

Who Should Wear This Hat?

No one, really – but be aware of the dangers of people offering SEO solutions with little transparency into the how. Keep an eye out for negative SEO attacks.

4. Red Hat

The Red Hat approach relates to feelings and emotions, often based on the gut reaction to an idea. This can be very beneficial for a digital project, as we can sometimes be overly rational in our data-driven approach.

Who Should Wear This Hat?

It can be helpful to assign this role to someone who works closely with the target audience, or who analyzes and interprets a lot of audience data.

Why Is It Needed for SEO?

When fighting for vital – and dwindling – consumer attention, first impressions matter. Content marketing campaigns can depend on getting this right, so it’s worth listening to gut instinct sometimes.

5. Green Hat

The Green Hat brings creativity and spontaneity to the process, tackling challenges from a new perspective when possible. Where others see obstacles, this approach will see new opportunities.

Who Should Wear This Hat?

Anyone can be creative. However, it may be best to assign this role to someone who feels comfortable sharing their ideas with a group and is not easily disheartened if they don’t take off!

Why Is It Needed for SEO?

There are best practices, but those only take us so far. They are a leveling force; new ideas are what really make the difference. In an industry as nascent as ours, there are plenty of trails yet to be explored. The Green Hat brings that element of innovation to a discussion.

6. Blue Hat

The Blue Hat organizes the thinking process and takes ultimate responsibility for bringing the different strands together into a cohesive whole.

Who Should Wear This Hat?

The project lead or the person closest to the brand’s objectives can help keep things focused. Project managers also have a natural affinity for this role.

Why Is It Needed for SEO?

SEO is an increasingly diverse set of disciplines, which makes this role indispensable. To maximize the organic search opportunity, various departments need to be working in tandem on an ongoing basis. The Blue Hat keeps this collaboration going.
Actual hats are optional, but may help the adoption of this approach.
Regardless, these ways of thinking have a range of benefits across any organization:
  • Opportunities to integrate more digital functions into the SEO process.
  • Ways to learn new skills, both by doing and by observing.
  • Integration of SEO best practices across more digital channels.
  • A central role for SEO, without reducing the importance of other specialists.

Conclusion

SEO has evolved to be part of something bigger and technical skills must be applied in a different manner.
If anything, it has expanded into a much more sophisticated and nuanced digital channel that has outgrown the “Search Engine Optimization” category. The core tenets of organic search remain firmly in place, with technical SEO given overdue prominence as a driver of web, mobile and device performance.
SEO professionals are often at the forefront of technological innovations and this looks unlikely to change in a world of voice search, digital assistants, and Progressive Web Apps.
New approaches are required if we are to maximize this opportunity, however. That begins with the definition of what exactly SEO entails and extends to the ways we lead collaboration within our organizations.
The level of technical acumen needed for success has returned to what it once was. However, where and how you apply that knowledge is key to technical success. Focus your skills on optimizing:
  • Your site.
  • Mobile and desktop devices.
  • Mobile apps.
  • Voice search.
  • VR.
  • Agents.
  • Vertical search engines (it’s not just Google anymore – think Amazon for example).
The AI revolution is begging for more technical SEO professionals and data scientists to help drive it forward.
If you act now and take a slightly different viewpoint on your role, organic search can assume a central role in both business strategy and cross-channel digital marketing.

Reference:https://www.searchenginejournal.com/new-approach-technical-seo/252463/?ver=252463X3