When WYS is not WYG

CMS Editor Quirks & WordPress TinyMCE

While I am a huge fan of CMSs (Content Management Systems), I do believe they have often been oversold by web design companies. The typical client for a CMS is an office administrator who is trying to find a way to save her company money by bringing the website updates in-house.


A CMS like WordPress is supposed to make those in-house updates easy for pretty much anyone with any experience level, right? Well, that’s how they’ve been sold. But it doesn’t always work out that way.

The problem is that the office administrator is familiar with office related software, not web design. Being able to create a decent looking page in Word is very different from creating a decent looking page in WordPress. I deal with frustrated clients all the time who just don’t understand why it’s not working as expected. So what’s going on?

There are a number of issues that make a CMS harder to use than it is sold as, and than it at first appears. So let's go through them.

WordPress is not a Word Processor.

WordPress (and any CMS) is fundamentally built on web technology. So its content formatting is restricted to what your browser can read and display. Word (and other word processors) don't have that restriction. They are built using system-level code. They are not restricted to displaying in a browser; they create their own display. So they can format their code any way they like.

A good example is bolded text.

The correct way for a word processor to format bolded text is to refer to the bolded version or defined weight of the font being used. If the font doesn’t have a bolded version or weight, it will not display the text bolded.


Probably in the name of user friendliness, but with the effect of driving prepress operators and printers crazy, Microsoft (Office and Publisher) decided to implement their own method: a visual-only bolding effect. When you hit that little bold button, the editor display window adds an extra-weight graphic even if the font doesn't have a bold version or weight.

User friendly, right? Because it looks bold, the user thinks it is bold. You can even print it to your home raster printer and it looks right.

Printer friendly? Not so much. When you send it to a true PostScript printer, such as those at a professional print house, the bolded text prints without being bolded.

Why is that? Because no information about the bolding is actually written to the print file associated with the visual file. It was a "user friendly" visual trick to satisfy the majority of home users and printers. The users perceive, and the printers receive, graphics only (not true PostScript) from the software.

Good magic trick, right? Well, it’s fine until you take it out of context.

Something analogous is going on when it comes to the design and layout of web pages. When the source content comes from a word processor or a word processor user, it's formatted for a completely different context. The WordPress content editor has a bold button which looks like Word's. But the way it functions on the code behind the visual display is very different.

How to copy and paste from a word processor.

It's common and good practice to create your website content in a word processor before copying it to your website. I do it all the time. My main reason is that I want a backup that isn't on my website. But I also like the autosave in my word processor better than the one in the CMS editor (if it even has one).

Unfortunately, it's not as easy as it seems. And there are no warnings telling you something is going wrong, or is about to. What happens is that, at some point, when you start to make changes to the web page, perhaps to fix some aspects which didn't copy so well, the layout goes all quirky. Whatever you try doesn't work, and often makes it worse!

This is by far the most common problem and frustration I’ve heard. To understand the problem, you need to understand how the content styles are being copied from your word processor to your CMS editor. Just because you can do it, doesn’t mean it’s going to work as expected.

When you copy content from your word processor, your text is being saved in your invisible OS clipboard so it can be used in other programs or windows. It doesn’t copy the native DOC format because that would be useless outside of Word. It doesn’t copy the visual display, since that is basically a graphic. It formats the copied content as RTF (rich text format).

RTF is similar to HTML (the language of web pages). But they are not the same. They are similar enough that the web content editor can interpret most of the RTF and convert it to HTML. So, like the problem above, from a visual point of view, they usually look almost identical. But under the hood, the code is usually quite different; the converted RTF usually includes a lot of unnecessary nesting. And that's what gets you into trouble later.


For you, the problem arises either when the pasted content doesn't appear as expected, or when changes are made and those changes don't appear as expected. The main difference between RTF and the HTML standards for web pages is that RTF includes all its style properties "inline", while the HTML standard is to include class references to CSS styles (what tells the web browser how to display the content) defined elsewhere. HTML standards decided to separate the content from its CSS formatting. But because RTF is an inter-document exchange format, it needs to encapsulate all its content and formatting (and images) within the same document.

For example, consider the humble paragraph. Here is what your paragraph HTML might look like when it is copied from Word:

<p class="p1" style="color: #000; font: times, serif 12pt normal; line-spacing:1.5; margin-bottom: 12px">Here is an example from Word with <span class="s1" style="color: #000; font: times, serif 12pt bold;">bolded text</span> copied into WordPress.</p>

And here is what it should look like:

<p>Here is an example from Word with <b>bolded text</b> copied into WordPress.</p>

The only good solution is to use the Paste as Plain Text button in WordPress. Or if you have already pasted it, use the remove formatting button. That remove formatting button will save you when you get into a mess and don’t know how to fix it. However, you then also lose all your HTML formatting. You’ll be starting from scratch. But that’s actually a good thing. That’s what you want to do. You were taking a dangerous shortcut. Stop and start over.

Here is an example from Word with bolded text copied into WordPress.

Removing and reformatting your content is a better solution in the long run because you don't start stacking and nesting tags on top of each other with unpredictable results. WordPress should handle the paragraph tags automatically. For other simple formatting like bold, italic and underline, you can go through your text in WordPress and re-apply them where needed. WordPress should add the proper, cleaned HTML. Links are another formatting element that you are better off adding in WordPress after pasting as plain text.

Keep your layout simple, stupid.


Another related and common problem occurs when the office administrator is over-ambitious with their design and layout. They've learnt how to wrap text around images, create columns and tables, and love to play with font colors, sizes and styles. The problem is that these more complicated layouts and formats don't translate very well. And they are not all that simple to recreate in WordPress if you don't know the proper HTML and CSS method for them. For consistency with your theme formatting, and to avoid formatting issues, it's best to keep your design and layout as simple as possible.

When formatting, always remember KISS!

The default WordPress editor is TinyMCE, mostly.

WordPress is packaged with the open source, multi-platform, WYSIWYG editor, TinyMCE. It’s the standard default editor used by nearly every CMS. WordPress has customized it, however, based on what they feel are its users’ needs.


For one, they've trimmed down its functionality quite a bit. That's a good thing in one sense because it prevents content editors from over-stylizing their content. There are no font, font size, or color buttons, for example. The editor is restricted to the styles defined by the WordPress theme's CSS for the HTML. While that might be frustrating to the office designer who wants to do something fancier to make it stand out, it does function to rein in the content formatting so it is consistent with the rest of the website.

The worst website design is one where every page is formatted differently, using many inconsistent colors and font properties.

The default WordPress editor has no buttons for creating tables or columns, however. And other advanced layout features, like text-wrapping around images, are not straightforward. TinyMCE is not designed for layout. It is designed for very simple text and image content.
Part of the reason for this is the philosophy that the layout should be provided by the theme. That doesn't always help, since the theme has very little control over your content inside the post beyond formatting the standard HTML elements. It could automatically put the entire content of a post in columns, for example, but not select parts.

To get around these limitations, many WordPress users have resorted to plugins. One of the most popular editor plugins, currently listed with 1 million+ active installs, is the expanded implementation, TinyMCE Advanced. Some of its popular added features include:

  • Support for creating and editing tables.
  • More options when inserting lists.
  • Search and Replace in the editor.
  • Ability to set Font Family and Font Sizes.

As a web designer and developer, I shudder when I see those features available to content editors who have no HTML and CSS knowledge. I still have nightmares from some layouts I’ve had to fix because the user just didn’t understand how to undo what they already did, stacking one mistake on another. Unless you have an understanding of the HTML implementation of those features, they are going to get you into trouble. It’s far safer, and often cleaner, to keep your website layout and formatting as simple as you can.

The difference between responsive and static design.

Another reason not to use those advanced layout features in WordPress is that they are not all automatically responsive and mobile safe. It’s a common misconception that just because your theme is advertised as responsive and mobile friendly, all your content will be as well.

Text-wraps in WordPress, for example, typically use CSS “floats” which are not automatically responsive. That means that when you change the width of your browser, the image might stay the same size, but the amount of text wrapping could decrease. On some devices, that might end up being something which looks silly, like a vertical string of single word lines.

Tables are another example of something we rarely use in responsive web design these days. In the old days, they were often used as page layout tools. It’s not uncommon for office designers to use tables for laying out their Word documents. That’s fine for static fixed width documents, typical for print. We even still use tables to format static fixed width HTML for email.

But for responsive websites, tables don’t work for layout. Instead, we use specially styled boxes which can adjust their size and even the number of columns based on the device’s browser window width. With these, we can get precise control over how the layout responds to the browser width, such as when an image wraps the text right and when it goes full width and drops the text below it.

The downside for the content editor is that you don’t get these advanced responsive design elements, not even with TinyMCE Advanced. We edit the HTML directly to get these. There are plugins which will allow drag and drop page layout building, but none available for WordPress that I would recommend to the regular content editor (a topic for another post). There are also plugins which add responsive design shortcode capabilities, but in general, shortcodes should be avoided in your CMS (another topic for another post). So we’re stuck with our maxim:

It's often far safer, cleaner, and more mobile friendly to keep your website layout and formatting as simple as you can, using only the simple tools provided with the default WordPress editor.

When you do need something a little more complicated, you can always shoot off a quick email to your friendly responsive web developer.

WYS is not always WYG.

A common theme running through this post is that, when providing content within a CMS, what you see is not always what you get. The Visual editor in WordPress is supposed to give you a preview of how your post is going to look on your website. It doesn’t. Surprised?

There is not much worse for an office administrator / web content creator than spending a great deal of time writing and formatting some new blog post or web page content, only to find out that it looks totally different on the website after it is published.

The main reason for this is that your backend admin is treated like a separate domain with its own formatting. It’s actually a separate theme file, with separate CSS for your fonts, headers, weights, spacing, and so on. That’s what the editor is based on.

If it were a little smarter, it could grab your theme CSS and apply that within the editor window. But it’s not exactly that simple. It would essentially need to read and rewrite your theme CSS just for application within that editor window. Not impossible for a developer to create, but certainly not a default feature.
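
For developers who want to close part of that gap, WordPress does provide an opt-in hook, add_editor_style(), which loads a stylesheet of your choosing into the Visual editor. Here is a minimal sketch, assuming your theme (or child theme) ships a simplified editor-style.css; the function name is hypothetical:

// functions.php — load a simplified copy of the theme styles into TinyMCE.
function mytheme_editor_styles() {
    add_editor_style( 'editor-style.css' ); // file assumed to exist in the theme folder
}
add_action( 'after_setup_theme', 'mytheme_editor_styles' );

Someone still has to write and maintain that editor-style.css so it mirrors the front-end theme, which is exactly the developer work described above.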

When you are editing your CMS content, except for the very general basics like distinguishing between paragraphs, and inserting hyperlinks and images, you are basically formatting blind. Again, you need to keep your formatting, style, and other design elements to a minimum.

In WordPress, the best preview is in the Publish widget on the right. The Preview button there should open the page up in a new browser tab for you to preview what it will look like to the visitor. If it’s not turning out how you would like, it’s better to modify the theme’s CSS to get it to do what you want. And if you aren’t familiar with CSS, it’s better you hire a web designer or developer to do it for you. Those are very simple and inexpensive theme edits.

WordPress preprocessing quirks.

Not only is the visual editor not providing an accurate preview of your content design, layout and formatting, WordPress is also modifying your content formatting prior to the theme’s display of it. What’s worse is that WordPress gives almost no administrator settings to control how it filters the editor’s content. There are a few settings which can be changed within the core configuration files. But you usually don’t want to touch those. Basically, the only solution is to be aware of WordPress’ preprocessing quirks and live with them.

What to do when your published paragraphs aren’t showing as paragraphs.

As an example of preprocessing, WordPress automatically converts the blank lines between paragraphs in your editor content into paragraphs on output, using a core function called wpautop. That means, even if you switch to the Text (HTML) view, you usually won't see paragraph <p> tags. If your published paragraphs aren't showing as paragraphs, it's most likely because wpautop has been turned off by something. I ran across this problem recently with my Yootheme theme, whose settings disabled wpautop by default. When it's disabled, the editor doesn't start showing paragraph tags either; they simply never get added on output. Enabling wpautop in the theme settings fixed the issue.
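
For reference, here is a hedged sketch of what is going on under the hood, and how a developer could restore the behaviour from a child theme's functions.php if the theme offers no setting for it. The wpautop function and these filter hooks are real WordPress core pieces; how your particular theme disables them may differ.

// Some themes switch the automatic paragraph handling off like this:
// remove_filter( 'the_content', 'wpautop' );
// remove_filter( 'the_excerpt', 'wpautop' );

// Re-attaching the core filters brings the automatic <p> tags back on output:
add_filter( 'the_content', 'wpautop' );
add_filter( 'the_excerpt', 'wpautop' );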

Your theme controls your formatting.

As I’ve already explained, the way your content displays on your website goes through your theme styles. These don’t show in the WordPress editor, even in Visual mode. That means that WYS in the Visual editor is not exactly WYG on the published website. As a result, you need to constantly Preview your content with the built in Preview button, or use a separate tab in your browser to switch back and forth between the backend editor and the website in the frontend.
Depending on your theme, you can change the styles, such as the font family and size, in the theme settings. If not, you will need your web developer to create or edit those CSS files for you. CSS styles are the proper way to handle HTML formatting; formatting in your editor is not a good work-around for formatting issues. It's important to understand that most formatting belongs outside of what should be accomplished within the editor.

High Expectations.

The root of layout and formatting issues in the CMS can be summarized as content editors expecting too much of the CMS and of their ability to control how their content is displayed. Typically, Office users see the WordPress editor buttons and think they work the same as the ones in their Office documents. A great step in the right direction to avoid these problems is to learn the limitations of the CMS, its editor, and website design in general. It's really a different beast than word processing for print or press.

If you insist on doing your content updates yourself instead of hiring a developer, and you don't want to learn HTML, CSS, how your CMS and editor deal with them, and the standard implementations for responsive, mobile friendly design and SEO, your best strategy is to keep your formatting as simple as possible.

I understand that CMSs have been oversold as simple, do-it-yourself website creation tools. They are, if your expectations are simple. As soon as you add any little bit of complexity, without a grounding in website architecture and development techniques, a CMS starts to go beyond a "do it yourself" website builder. That's partly the reason for the recent popularity of DIY website builders like Wix (a topic for another post).

The solution is to lower one’s expectations of how much they can do themselves in their CMS. CMS’s are still great for organizing, managing and formatting various content assets. They are really powerful tools in the hands of developers, and they radically speed up development time compared with the old custom built website methods. But they don’t yet put all the power in the hands of the layperson.

WordPress editor developer-specific notes.

These might read as a list of my WordPress pet peeves compared with other CMSs like Joomla. CMSs like Joomla and Drupal are targeted more towards website developers. So it is sometimes frustrating and more time consuming for a website developer to use a dumbed-down CMS like WordPress. Under the hood, at the API level, WordPress is arguably just as powerful as any CMS. But that power is not always available out of the box or even with third party plugins. I've found this especially true when it comes to the default editor (as well as the available editor plugins).

No administrator editor settings.

I find it shocking that the WordPress developers chose not to add Editor settings to the Settings menu in the WP-Admin. TinyMCE itself ships with over 40 features and settings. It's also Open Source, which means it's easy to customize and extend. I understand the reasons for paring it down for the regular user, and content editors shouldn't be given permission to change administrator settings. But shouldn't website administrators have access to the editor configuration settings?

WordPress does give us access, but only programmatically. You can manually change the default configuration file in wp-includes/class-wp-editor.php, but that's not an update-safe way to preserve your changes. WordPress does provide a method to hook a custom editor profile through either your theme's functions.php file or within the template files themselves, but these are not theme-update-proof either. Within the WP-Admin, WordPress gives you access to many of its files through Appearance->Editor, but most of these files shouldn't be changed either. It's really a mystery why WordPress would include access to them when it doesn't include access to the editor configuration. In fact, the editor configuration file isn't even listed there.
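
For the record, here is roughly what that programmatic access looks like. This is a hedged sketch from a theme's functions.php using the real tiny_mce_before_init filter; the particular settings shown are standard TinyMCE options chosen as examples, and the function name is hypothetical:

function my_tinymce_settings( $init ) {
    // Limit the format dropdown to a few safe block formats.
    $init['block_formats'] = 'Paragraph=p;Heading 2=h2;Heading 3=h3';
    // Default to pasting as plain text, avoiding the inline-style mess described earlier.
    $init['paste_as_text'] = true;
    return $init;
}
add_filter( 'tiny_mce_before_init', 'my_tinymce_settings' );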

The proper way for a developer to make changes to the WordPress core behaviour is by either (a) altering a copy of functions.php within a child theme or (b) creating a custom plugin. Child themes are preserved through theme updates, so they are the recommended method for making any theme changes, including those overriding the WP core configurations. Themes and frameworks sometimes have their own methods for creating child themes which differ from the standard WordPress method, but it should be pretty straightforward whichever theme you are using.
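
As a minimal sketch of that standard child theme approach (folder and handle names are hypothetical): the child theme's style.css header declares its parent with a "Template: parent-theme-folder" line, and its functions.php loads the parent stylesheet and then holds your overrides.

// Child theme functions.php.
add_action( 'wp_enqueue_scripts', function () {
    // Load the parent theme's stylesheet so the site keeps its look.
    wp_enqueue_style( 'parent-style', get_template_directory_uri() . '/style.css' );
} );
// Editor or content hooks (like the wpautop and TinyMCE examples above) can
// live here, safe from parent theme updates.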

A child theme is great if your changes are going to be theme dependent. But what if you change themes in the future? You, or whoever inherits the web development of the site, will need to remember to copy over all the important modifications. A better, theme-independent method is to create a custom plugin.
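
A bare-bones, hypothetical example of such a plugin: a single PHP file dropped into wp-content/plugins/ with a standard plugin header and whatever hooks you want to survive a theme switch.

<?php
/*
Plugin Name: Site Editor Tweaks (hypothetical example)
Description: Theme-independent home for editor and content filters.
Version:     0.1
*/

// Example: keep WordPress' automatic paragraph handling on, whatever the theme does.
add_filter( 'the_content', 'wpautop' );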

Fortunately, in this case, there is already a plugin available which makes the default WordPress editor configuration available within the backend WP-Admin: Advanced TinyMCE Configuration.

If the developer keeps it up to date with WordPress' TinyMCE updates, this plugin should do the trick. It's not super user friendly, requiring one to learn the setting names and values from the TinyMCE docs. It's not very "Wordpressy". And it still surprises me that administrator access to these settings isn't a core feature of WordPress as it is in Joomla.

Switching from Text to Visual modes modifies custom HTML.

When switching between Text and Visual modes in the TinyMCE editor in WordPress, sometimes HTML code is rearranged, reformatted, or even removed. This is because WordPress runs Visual mode through some HTML cleaning features. I’m not sure why they chose to do that since, in Visual mode, you aren’t editing HTML anyway. Perhaps it’s intended to clean up bad tags from a copy and paste. But it’s really annoying if you want to primarily or occasionally work with the HTML, then switch back to Visual to check it or add images.
This has been an issue within WordPress since it started using TinyMCE. It’s not a TinyMCE issue, however, as many of the forums suggest. Joomla’s default editor is TinyMCE as well, and it doesn’t have the same issue switching between code view and WYSIWYG view. This is one of the main reasons I’ve always favoured Joomla over WordPress.

What WordPress has done to address the issue is add a user option to disable the Visual editor, which makes the editor's Text mode the default. It's not enough, however. Sometimes I want to switch to Visual mode to make updates more quickly than digging through the HTML. And there is nothing to prevent other users from opening and saving the post. If they default to Visual mode, then I could lose my custom HTML. It's not unusual for multiple editors with different skills to work on the same article.

The problem is even worse for me as a developer when my customer wants me to fix their content to make it more responsive. In order to do that, I need to use custom HTML which WordPress might not approve. What we need is some method to tell WordPress not to clean the HTML, even in Visual mode.

Many of the WordPress forums discussing this issue (going back over 11 years) claim that this is an issue with TinyMCE and not WordPress. And again, I am telling you it is not. I've read the entire configuration docs for TinyMCE, and there is no setting for auto-formatting the HTML. This is something WordPress core adds on top. I haven't spent the time to narrow it down to a hack yet. But that's certainly on my to-do list. The bottom line is that editing or hooking into class-wp-editor.php isn't going to help here.

Browsing through 800+ editor plugins in the WordPress Plugin Directory, I found three plugins which purport to address this longstanding need. The first, "preserved html editor markup plus", hasn't been updated in 4 years and has a pretty strong installation warning. I'll have to sandbox that one in a fresh local WordPress install to test it. Being that old, however, I don't have much hope for it working as is. I'm curious to browse through its code, though, to find out what method it used. The second one, "preserve code formatting", isn't going to work because it doesn't address the reformatting issue caused by switching to Visual mode. In its description, it actually recommends never switching to Visual mode, which certainly is not going to work for our clients. I had the most hope for the third plugin I tested, "dont muck my markup". However, I failed to see any difference. I still found that the code in Text mode was being reformatted when I switched to Visual mode.

Without a WordPress recommended method, and no seemingly working plugin for it, it looks like we have to resort to using a different, more code friendly, editor. Of the many editors available, I found that the most popular, TinyMCE Advanced, did the best job of preserving my custom HTML even when switching to visual mode. It’s not perfect, and for client sites I’d want to trim down the Visual editor buttons to just those which are safe for the client to use. But it looks like TinyMCE Advanced provides the only available alternative to WordPress’ quirky default editor behavior, allowing for content editing at the different levels as desired.

No ‘Code Editor’ and ‘No Editor’ options.

In Joomla, I am used to having the option to use a strictly code editor, CodeMirror (with syntax highlighting), or even no editor at all (a plain text mode like the file editor in WordPress). Joomla has several excellent code-only editors with syntax highlighting to choose from. Sometimes I prefer to not use any editor at all. That's when disabling all editors is useful.
While there are many WordPress plugins which provide syntax highlighting for the file editor (for editing theme and plugin files), I didn't find any specifically for the content editor. And none were strictly for code editing with no visual mode, replacing and bypassing TinyMCE altogether. Even more surprising, there appears to be no way to turn off the editor for posts and pages so that it becomes like the default one for files (i.e., no editor).

This really demonstrates to me how different the audiences are for WordPress and Joomla. Developers for WordPress are more focussed on the lowest skilled user, and not other developers. In Joomla, the developer community seems tighter to me, and the plugins seem to me to be more developed for developers.

I’m not going to argue that either is better since they are different niches. However, it would be nice to have a strictly development mode available in WordPress when I have a client who does not intend to update their own content, or when I need to get in there and do something really quickly without all the extra “user friendly” frills.

Can’t Set Default Editor by User, Role, or Page.

Another feature of Joomla I seriously miss in WordPress is the ability to set the default editor by user. That allows me to set my code editor, or no editor, as a default, while providing the client with a more user friendly visual editor. This could be set in WordPress in the User settings, or in a role management setting, or even on the page being edited (to make it easy to switch between multiple editors while working on some post). The lack of good editors for WordPress, let alone developer-specific ones, makes this somewhat of a non-issue at the moment. A minimal plugin could at least disable all editors, however. That would certainly be preferable to nothing.
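
There is at least one real core filter that gets part of the way there today. Below is a minimal, hypothetical sketch that forces the plain Text editor (no Visual mode) for chosen user accounts; it could live in the theme-independent plugin described earlier, and the user list is made up:

add_filter( 'user_can_richedit', function ( $can_richedit ) {
    $code_only_users = array( 'dev_account' ); // hypothetical developer logins
    $user = wp_get_current_user();
    if ( in_array( $user->user_login, $code_only_users, true ) ) {
        return false; // Visual mode off; only the Text (HTML) editor remains
    }
    return $can_richedit;
} );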

Developer Notes Summary.

Compared with Joomla, the limited choice of available editors, the lack of access to editor options, and the inability to switch or disable editors by user are serious impediments to WordPress attracting developers to build websites with it. Fortunately for WordPress, it is popular enough that clients who don't understand these issues request WordPress-based websites anyway. That doesn't help us developers, however. For the moment, we seem to be stuck with TinyMCE Advanced.

On the upside, the editor issues and gaps provide a niche for plugin developers. I'm seriously considering spending some time to tackle some of these issues. I'm not sure how popular the solutions would be, however. Most of the editor plugins seem to target the wider, unskilled user base. There is quickly growing interest in drag-and-drop block builders to replace the visual editor in WordPress, now that they are commonplace in DIY website builders like Wix and bulk emailer apps like Mailchimp. But there's no reason you can't have both. And there's no reason they can't be selected by user preference, or even on the page being edited, as the user finds need.


Best SEO Strategies and Practices for 2016 and beyond


There have been quite a few major changes over the years with how search engines rank websites, pages, and content. In this article, I briefly survey the most significant changes made by Google. The emphasis will be the recent Google updates affecting SEO and the best strategies for dealing with them.

In the Beginning there was PageRank

In the beginning (of the internet), the internet was without form and void. Unless you had a specific address for the content you wanted to load, there was nothing to “browse”. There was no method to search across addresses for content relevant to your research interests. In fact, the internet was not used for research at all.

I fondly recall doing my university research in the 1990s. All research was done in the libraries. The libraries did have a searchable, indexed digital catalogue for their books by title, author, year, synopsis, and keywords. For journals, it was a little different: we had microfiche and microfiche scanners. Searching through academic journals was essentially analogue and took many hours, days, or even weeks to find enough relevant material.

There was a shortcut, however, to wading through the microfiche of millions of journal articles. Nearly every well regarded textbook in your field referred to the most significant journal entries. So if you wanted to do research on a topic beyond the introductory textbook information in a chapter, all you had to do was jump to the bibliography. And bang! There’s a list of mostly journal articles to start your research. Spend a long day in the library, and about $40 in photocopying, and you could come home with every primary article source for that topic. After a week or so of reading, you could go back with a secondary list of sources compiled from the bibliographies of the relevant first sources. And so on, until your day of rest, usually the day before the day the research paper is due (the very last day reserved for writing!).

If you think about it, and especially if you have a similar academic experience in a research-based discipline, the challenges of finding relevant information were more than just analogous to those of developing a search engine for the internet. Those challenges and their existing solutions were both the context and inspiration for Google's early search criteria.

PageRank, named after Larry Page, both an academic and a founder of Google Search, is a criterion for ranking the authority of pages based on which, and how many, other pages refer to them. It's a method for mathematically discovering the primary, secondary, tertiary, and further-removed status of sources. If everyone in a field refers to some seminal journal article, we should assume that it is an essential primary source of the highest quality, authority, and trustworthiness. PageRank is basically the same shortcut we used in academic research to get around the inefficient and time-consuming methods provided by the library, except the concept was to index the entire internet and apply the shortcut to that.

All this assumes sufficient similarity between academic content and internet content, well before anyone could predict exactly how, or how diverse, the internet would become, both in content and use. The early implementation of PageRank also used other commonly accessible journal-article-like attributes such as the title, author, keywords, source (a URL instead of a journal name and number) and abstract (the page description).

These worked pretty well for the first decade or so, well enough to make Google the dominant search engine for the internet. But they weren’t without problems. And as the internet evolved, so did the challenges Google faced in indexing and ranking searches. A dominant theme throughout is relevancy. How does Google rank and present the most relevant results for a given query? We will continue to explore this theme as we get to the best SEO strategies and practices for 2016.

Problems in Paradise (and the Rise of Censorship)

Or How Not to do SEO

While Google built one of the most profitable tech empires, partly based on Google Search, and by far the largest internet database and search engine in the world, it was not without problems. The goal was to provide the most relevant search algorithm for content on the internet. But their model for content was academic journals with standard library search parameters. There was one significant difference between internet content and academic journal articles. Academic journal articles were already peer reviewed for content and quality prior to publication. The internet was an open space for content of any quality and topic.

This big difference created big problems for Google Search and motivated their revisions over their second decade. All their revisions involved increased censorship, with the intention of quality management similar to the peer review of academic journal publications. But there was no possible way Google could be an expert in every possible topic on the internet. So while Google's stock began trading publicly, mostly driven by AdWords revenue, their search engine suffered abuse after abuse, creating a decade where SEO was more about exploiting Google's ranking deficiencies than providing genuine quality content.

Bogus Backlink Exploits

You probably still see spam email telling you that someone is willing to provide hundreds of quality backlinks to your website for almost nothing. This is a legacy of the most common Google exploit during this period. PageRank made backlinks the most important method for improving page position in a Google search. For a time, you could effectively fool the Googlebot into thinking that your site was the most referenced on the internet for a specific topic. If you recall my explanation of academic research using journal article references and sources, that would have worked for internet pages as well, but only if they were already peer reviewed. To combat fraudulent use of backlinks, Google took a two pronged approach.

First, Google modified how they determined the "quality" of the source of a backlink. Previously, as with academic references, quality was merely a quantitative measure of the relative number of references to a source. In Google rankings, backlinks from sources with higher PageRanks were higher quality backlinks. But now, backlink quality had to be determined somehow from the source itself, apart from PageRank. A backlink from Apple.com, for example, had to mean more than one from some unknown blogger's website. It's not generally known exactly how they changed their backlink quality assessment. Because of the proliferation of exploits, many of Google's updates were kept secret. It could have been as simple as manually creating a database ranking the top sources in different industries and sectors. What is clear is that they changed something, because this sort of backlink exploit stopped working.

Anybody who tells you they can get your website onto the first page of a search just by generating backlinks for you is selling you a scam. You should immediately junk it as spam. Don't waste your money.

Second, Google censored suspicious targets of excessive backlinks. Again, their exact methods were very secretive. However, what was clear was that if an unknown website suddenly got a large number of backlinks, it could be flagged for manual review. Once reviewed, the website and all its associated domains could be blacklisted from Google Search. There was no algorithm for blacklisting, and it was totally up to the judgement of someone at Google.

During this time, there was a lot of buzz throughout the SEO community about “negative SEO”. As SEO professionals, we were worried that malicious attacks could spam backlinks to our client websites, flag them, and shut them down. This would be entirely out of our control, since we can’t control who puts a link on their website pointing to ours (outside of manually blocking those references one by one by URL or IP server-side, which would be loads of manual labour).

When Google addressed this issue, they denied that such negative backlinks counted against a site in their ranking algorithm, though they did not deny blacklisting domains. They did say, however, that the number of domains they had to blacklist was extremely small. Basically, they had to have enough evidence that the domain which was getting the excessive backlinks was also the one generating them. "Negative SEO" was a myth in that sense. But it was also totally up to the judgement of Google. Google was doing source-checking the same way an academic peer reviewer would for an article under review.

Backlink exploits aren't only scams; they can be viciously harmful to your website's search position. They can even get your domain blacklisted by Google Search, but only if Google has sufficient reason to believe the exploit was intentional.

Since then, backlinks and PageRank have been consistently decreased in their ranking priority. There is good reason to believe that eventually they will be removed altogether, and that Google is beta testing a version of their ranking algorithm without them. As I will explain later, other factors, such as the shift toward social media interaction, seem to have a greater effect. These shifts better reflect how non-academic, word-of-mouth references work in the business and commercial world to peer review content and products.

URL Abuse and Domain Name Parking

Google used to prioritize keywords in domain names and URLs, even over keywords in content. I drew the analogy earlier between URLs and journal sources. Academic journals tend to include their topic in their name. We know that "The American Journal of Physics", for example, is going to contain articles about physics. In the same way, people were advised to purchase domain names which were descriptive of their content. A plumbing company named Grunge would be advised to get a domain name like grungeplumbing.com. While this might be a good idea in general, there are exceptions ("Google.com", for example, isn't "googlesearch.com"). And the emphasis on descriptive domain names led to a virtual real estate submarket where companies would register and hoard domain names with descriptive content, jump on registration of expired domain names, and pick up domain names similar to, or even misspellings of, existing domain names. All in hopes of later reselling them at premium prices.

This practice also led companies to buy multiple domain names related to their primary domain. The fear was that someone could use a related domain name and quickly outrank yours. Google basically ignored this practice and fear throughout this period. They probably hoped that their other modifications would outweigh the abuses. Eventually, however, over the last five years or so, domain names and URLs have decreased as a ranking factor. Today, they are virtually non-existent as a factor, but for other reasons I will explain later.

Don’t worry about your domain name in terms of SEO. Get something easy to remember. And you only need one. Your content will speak for itself.

Keyword Krikee

Keywords were probably the most commonly understood SEO element. When someone uses a search engine, they try to guess the keywords which best match what they are searching for. So on your webpage, you would do the same in reverse, trying to guess the keywords that users searching for you would use to find your content. It's sort of like the SEO version of the television game show Family Feud. But, without any body of peers reviewing your keywords, keywords were also the simplest criterion to abuse.

Instead of trying to accurately describe your page content with a few keywords, SEO hackers would use common keywords and phrases to boost a page's position in a wider range of searches. Or they would attempt to find niche keywords or phrases which had high query volumes but a low number of results. This was all done apart from, and often unrelated to, the page content.

Again, for a long time, Google did almost nothing to combat keyword abuses. Again, they probably hoped that by adding other factors, the harm of keyword abuses would be diminished. Eventually, they shifted toward prioritizing keywords in content. What that meant was that they expected you to put your most valuable keywords in the most prominent content, such as your titles and introductory paragraphs, and to repeat your most desired keywords. This practice paved the way for the SEO analysis of keyword density, whereby we measured how many times each word and phrase was used and compared that, using Google's Webmaster Tools, with the actual queries giving your pages impressions in a Google Search. Google banked so heavily on keywords in content and keyword density that they eventually stopped reading the keywords meta tag altogether (which had been the original primary source for keywords, just like in academic journal articles).
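
As a toy illustration of that measure (the sample sentence and keyword are made up), keyword density is just the number of occurrences of a term divided by the total word count:

$text    = 'Grunge Plumbing offers emergency plumbing repairs and plumbing installations.';
$keyword = 'plumbing';
$words   = str_word_count( strtolower( strip_tags( $text ) ), 1 );
$hits    = count( array_keys( $words, strtolower( $keyword ) ) );
$density = count( $words ) ? round( 100 * $hits / count( $words ), 1 ) : 0;
echo "'{$keyword}' density: {$density}%"; // 3 of 9 words = 33.3%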

Neither keywords in content nor keyword density were sufficiently secured from abuse, however. Exploits included hidden keywords in content and misrepresentative or irrelevant keywords scattered throughout the content, visible or hidden, meant to artificially target high value queries. Businesses themselves were encouraged to have two layers of content, one for the search engines, and one for their actual potential visitors.

I believe Google still hasn't fully resolved this issue, judging by the fact that their latest series of updates has primarily targeted it. Keyword density, which used to be a positive measure, is now considered part of keyword stuffing, which is negative. Since these were common practices, and the latest series of minor updates was released very quietly, content revisions conforming to Google's latest best practices should be considered a top SEO priority. We'll discuss these revisions later when we look specifically at the best SEO strategies and practices for 2016.

It is against Google policy to hide keywords in your content with the intent to mislead Google and misrepresent your content. “Keyword stuffing” is also against their policy. Content revisions conforming to Google’s latest best practices should be considered a top SEO priority.

Best SEO Strategies and Practices for 2016 and beyond

I hope that having a broader understanding of the historical context of Google's search rankings provides a better understanding of the changing role of SEO going forward. The challenge for Google has been translating their search engine from a tool for finding academic library content into one for finding uncensored, unreviewed, real-world content on the internet, and making it relevant to businesses and commerce without overbearing central censorship. I'll review their most recent changes and try to put them within this context so we have a long-term strategy for developing best SEO practices for 2016 and beyond.

Localization Makes Everyone a Winner

There's one big difference between searching for academic journal articles and searching for online businesses which I haven't yet discussed. Many, if not most, businesses are local. That means they have a location (even if they distribute to a wide number of locations). For SEO, that means you don't need to be in the top-ranking listings globally for a specific query. You just need to place at the top within your locality, against your closest local competitors.

SEO is like an auction in that it is a bidding war. Given a group of competing businesses, and given equal efficiency on their SEO expenditures, all else being equal, the one who spends the most is going to get the best positions in a Google search. If that were true also over a global distribution, no one except the massive multi-national brand empires would ever get good placement on a Google search. So one of the solutions Google worked very hard on (while others were working on social media), was localization of search queries. This resulted in a number of different, but later connected, Google technologies.

The biggest winner for Google in the localization space was Google Maps. Google Maps has become as ubiquitous as Google Search, if not more so. High-tech growth businesses like Uber rely on the GPS location data provided by Google Maps. Google Maps doesn't only involve people searching the map for a location; it also includes all the background API calls to Google Search from web and mobile apps which want to integrate location data about a user or content query. Many times you don't even see these calls being made. Google Maps also allows users to add business and other location-specific information, media, and other content. This additional content is also available to apps through the API. Ads can also be used to target location-specific content.

Within Google Search results, the top ten local maps listings are often provided at the top of the Google Search listings (right under the top 3 Adwords paid advertisements). Businesses can make use of localization for SEO just by having their business listed in Google Maps. Most recently, Google has launched My Business as a central manager for both your business page(s) in Google+ and your Google Maps listing(s).

I believe we are just seeing the beginning of the uses and effects of localization. Localization could potentially be used for highly personalized and targeted marketing campaigns. An app, for example, could allow a business to tailor a store special based on a user's buying patterns, then offer that special just as the user is walking or driving within a specified proximity of the business location. With the addition of tracked devices in the Internet of Things (such as pet tracking collars), the localization possibilities explode even further.

Here’s a list of my top localization strategies for 2016 and beyond:

  1. Get listed on Google My Business. By adding a location there, you will also be listed on Google Maps. If you already have a location listing on Google Maps, make sure that location shows up in My Business; otherwise you might end up creating duplicate listings, which are both bad for your SEO and really difficult to fix afterwards. Tip: make sure you use the same Google account email to log in to all your Google services. If you need to have more than one account access these services, learn how to add a user account to the particular service.
  2. Make your Google Map listing stand out by adding images and business information. Don’t forget to add your website address! This helps the user and Google connect your business location with your website and better localize searches for your content.
  3. Get your web developer or SEO expert to add location structured-data tags to highlight the location information on your website. Most websites put their location information in their footer, which helps Google localize you for searches. But you can explicitly highlight that data for Google and your search listing by adding meta tags around that information (see the sketch after this list). Google integrates structured data as defined by http://schema.org/.
  4. Using Google's Webmaster Tools, your SEO expert can help you localize your content for common related search queries in your area. A sound SEO strategy is to optimize your content for queries in the shortest optimal radius around your location (roughly 5-10km). Once your average position and click-through rate (CTR) are high in a narrow radius, that radius can be expanded with little effort. For SEO, narrowing your target first is a much more effective strategy than casting a broad net to see what it catches.
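
As referenced in point 3, here is a hedged sketch of what that location markup might look like, printed into the page head by a theme or plugin. The wp_head hook and the helper functions are real WordPress APIs, and the schema.org vocabulary is standard, but the business details are placeholders to replace with your own:

add_action( 'wp_head', function () {
    $business = array(
        '@context'  => 'http://schema.org',
        '@type'     => 'LocalBusiness',
        'name'      => 'Grunge Plumbing',      // placeholder business name
        'url'       => home_url( '/' ),
        'telephone' => '+1-555-0100',
        'address'   => array(
            '@type'           => 'PostalAddress',
            'streetAddress'   => '123 Example Street',
            'addressLocality' => 'Yourtown',
            'addressRegion'   => 'ON',
        ),
    );
    echo '<script type="application/ld+json">' . wp_json_encode( $business ) . '</script>' . "\n";
} );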

The Rise of Social Media Marketing

Many technology and communications experts believe that Google completely missed predicting the rise of the social media giants. Some think it was because Google is a geeky, academic computer science company, not a cool, hip company like Facebook. Others believe that their focus has always been on data, not on social interaction, so entering social media wasn't a good fit for them. I believe they probably predicted the size of the data available in a social media space like Facebook, but didn't see it as core to their business model, for whatever reason.

Officially, Google has claimed that social media activity has no direct influence on Google Search. The keyword there is direct. Other SEO experts, like those at SearchMetrics and KissMetrics, have reported that their tests show some influence between social media activity and page rankings. This may be indirect, based on other Google factors. For example, Google might have shifted their backlink ranking to include backlinks from social media posts. This would make sense, since social media references to a page are a much more natural representation of word-of-mouth recommendations. Plus, social media is already semi-censored and peer reviewed.

Of all the social media platforms, even though Facebook is by far the most used, Google+ clearly provides the most SEO gain. Again, the effects are likely indirect. It's not so much that Google is explicitly trying to promote Google+ through Google Search rankings. Rather, Google's integration of their services provides tools to help you rank higher. For example, Google+ Pages provide a means for user feedback, ratings, and reviews. If you have connected the services correctly in My Business, these Google+ Page ratings and reviews will show up both in your Google Maps and Google Search listings. Not only will they show up, but they will factor into the position of the listing over those which haven't been reviewed. Following my analogy to peer-reviewed journal articles, these ratings and reviews are a strong and common replacement for academic peer review in the non-academic business and commercial world. Consumers, especially Millennials, rarely make a purchase online or step into a new store without first consulting reviews. Google is merely utilizing and patterning this behavior within its search results while phasing out the less personalized method of backlinks and PageRank. The integration and use of Google's services to demonstrate the quality of your products or services is a highly effective SEO strategy for 2016 and going forward.

By sheer volume and engagement of users, Facebook is probably the most important social media platform for marketing. For SEO, it is a close second to Google+ according to recent tests. To understand the SEO benefits of Facebook activity, first understand that Facebook links back to your website count as backlinks. They might also count as quality backlinks, since Facebook content has some censorship and peer review characteristics. While anyone can post nearly anything, if your peers don't like it, they aren't going to repost it. And Likes of content increase the number of feeds the post appears in. So the increased number of impressions in feeds really does represent a sort of general consensus about the quality of that post. You might disagree if you've seen the kinds of posts which get massive likes and reposts! But that is purely personal opinion. "Objective" quality in social media is equivalent to popularity. Peers need not be highly educated for their reviews to count, as they must be for academic journals! By embracing this difference, we can understand and use popularity as the social marker of authoritative, quality content on the internet. This is a tough one for Gen Xers and older to accept: culturally, we tend to come from a pre-populist, pre-internet, pro-education culture, in which much of what is popular on the internet appears to be "objectively" garbage by our standards. If this is your problem too, the trick to success in the social media space, I think, is lowering your standards, dumbing down content, and appealing to the lowest common denominator instead of to highly educated specialists. Of course, this totally depends on the sort of content you are trying to promote.

I'm not a social media expert by professional standards. I don't even enjoy most social media activity for personal use. But I understand it enough to recognize its cross-over implications for SEO and digital marketing. The biggest advantage of social media is that it changes the way we access information. Instead of actively having to search for everything we want, social media provides passive "feeds" of information intended to be relevant to our interests and behavior. Feeds provide the opportunity to actively present yourself to potential audiences rather than passively wait for them to find you. Social media puts the onus on the content provider to be seen and heard rather than on the consumer to find what they are looking for.

Hiring a social marketing specialist, or using a professional social media manager like Hubspot, can be expensive and is often out of the reach of small to medium sized businesses. Even using a free social media management service like Hootsuite can be overwhelming and time-consuming for many. The trick is understanding what you can do effectively in-house without expending excessive time and resources. If you don't think you will be investing heavily in social media for your digital marketing, I would at least consider investing in it enough to help your SEO.

A quick and easy example of a social media workflow for SEO that I recommend goes something as follows:

  1. Publish blog posts on your website regularly (how regularly is up to your available time and content). A blog can be considered the starting point and manager for your outgoing social media posts. Typical of press releases and newsletters, a blog post can highlight an aspect of your business. It's a really good idea to provide links from a post both to other related pages on your website and to your contact page. It's also a good idea to have a subscription call to action which can collect interested viewers' contact information and notify them when you release new blog posts.
  2. Auto-publish your post to your social media pages. If you are using a CMS, there are plugins to help auto-publish your post to your social media pages. The plugin should format your post nicely for the social feed using Facebook's Open Graph protocol, which is widely accepted beyond Facebook for formatting page data nicely within a social post (see the sketch after this list for the kind of tags involved). The plugin will also provide a "read more" link back to your post on your website. That little "read more" link is the first step to creating the desired backlink and social authority for your post page.
  3. Use social sharing on the post to facilitate re-posts and likes. Again, if you’re using a CMS, there are plugins to assist social sharing. Social sharing provides the means to increase the reach of your post. Increasing the reach of your post is the second step to building the social authority and backlinks to your post page.
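
To make point 2 concrete, here is a hedged sketch of the kind of Open Graph tags such a plugin typically prints into the page head for a post. The WordPress template functions used are real; a dedicated plugin will handle many more cases (descriptions, fallbacks, other networks) than this:

add_action( 'wp_head', function () {
    if ( ! is_single() ) {
        return; // this sketch only tags single posts
    }
    printf( '<meta property="og:type" content="article" />' . "\n" );
    printf( '<meta property="og:title" content="%s" />' . "\n", esc_attr( get_the_title() ) );
    printf( '<meta property="og:url" content="%s" />' . "\n", esc_url( get_permalink() ) );
    if ( has_post_thumbnail() ) {
        printf( '<meta property="og:image" content="%s" />' . "\n", esc_url( get_the_post_thumbnail_url( null, 'large' ) ) );
    }
} );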

You might additionally use social media marketing tools to boost the reach of your post. Facebook has a "boost" feature available on feed content. But even if you don't spend advertising money on social media, if you follow the steps above regularly, using your blog you can easily and quickly increase the quality traffic to your website and, indirectly, boost your page position in Google Search.

Mobile Devices, Multiple Devices, and the Internet of Things

For about two decades, internet access was basically restricted to desktop computers (there are a few technical exceptions I won’t go into here). Only relatively recently has computational power gotten cheap and small enough to provide internet access on other devices such as mobile phones and tablets. Mobile devices, however, have had a huge impact on the way we use the internet.

The internet is not just something which is searched in a browser. Apps can access it, reprocess it, and represent it in niche-specific ways. Users can browse or use apps across multiple devices. And, increasingly, apps will be connected to devices with very specific functions and no interface of their own (recall my GPS pet tracker collar example). The proliferation of devices and uses of the internet has presented Google with some major challenges in indexing and ranking content, since we’re not just talking about website pages anymore. (I was just recently browsing the Samsung website. While they put smart technology into all their electronics and appliances now, they have also created a category for SmartThings which contains, for now, Smart Home sensors, hubs, and outlets which can all be controlled with your mobile phone.)

For some time, while Google worked on the challenge of tracking the same user across multiple devices, each seeing different variations of the content, Google Search kept mobile and desktop search traffic separate. That meant you could rank high on one but not the other, and the same search on one could have radically different results than on the other. That was until the highly publicized update on April 21, 2015.

What you probably heard about that update was that you had better get your website mobile friendly or it would be penalized and drop in rankings. I provided our clients at Allegra with a similar notification in my post, Is it time to upgrade your website for mobile traffic? What Google promised to do was (a) test for mobile friendliness, (b) penalize pages which did not pass the test, and (c) merge mobile and desktop search into a single index.

Since Google incorporated its Mobile Friendly Test into its page ranking system, I’ve seen an average session drop of 20% on client websites which have not been updated for mobile traffic. The drop has come entirely from organic Google search traffic. This update is real, and it has a major effect on traffic to websites which do not pass Google’s mobile friendly test.

That your client base does not use mobile phones to browse your website is no longer a good excuse to avoid updating it. Failing to be mobile friendly and responsive makes it harder for desktop clients to find you in Google Search as well. And if you’re like most websites, users from Google Search probably account for about 50% of your new traffic.
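For reference, the most common reason a page fails the mobile friendly test is a fixed-width layout with no viewport meta tag. Here is a minimal sketch of the baseline fix on a WordPress site (the function name is mine; a properly responsive theme already does this, plus the CSS media queries that actually adapt the layout):

```php
<?php
// Minimal sketch: ensure a responsive viewport tag is printed into the <head>.
// Without it, mobile browsers render the page at desktop width and
// Google's mobile friendly test fails the page outright.
function swd_viewport_meta() {
    echo '<meta name="viewport" content="width=device-width, initial-scale=1" />' . "\n";
}
add_action( 'wp_head', 'swd_viewport_meta' );
```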

Content Killed the Keyword

Over the last year, since the latest big Google update, I began noticing a small drop in traffic on some client websites even though we had made them ready for the mobile friendly test with completely responsive design and high scores on mobile-related PageSpeed Insights. This puzzled me. I retested the websites and still found no mobile-related issues. So I began digging and doing some research.

It turns out that behind the big fanfare of mobile readiness was another really important update to the way Google reads content. I’ve already mentioned that Google stopped reading keywords in the keywords meta-tag. Since then, all SEO advice was to embed keywords in your content, particularly in significant markup like titles (and even bold tags). We all measured keyword density, the number of times a word or phrase is used throughout the page, and our strategy was to make the keyword density accurately reflect both the intended page content and the search queries being used to find that content. Well, that’s all changed again!
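For context, keyword density was simple enough that many of us scripted the check ourselves. A rough sketch of the old calculation in plain PHP (the function name and sample text are mine):

```php
<?php
// Rough sketch of the old keyword density calculation:
// density = occurrences of the phrase / total words on the page.
function keyword_density( $html, $phrase ) {
    $text   = strtolower( strip_tags( $html ) );
    $phrase = strtolower( trim( $phrase ) );

    $total_words = str_word_count( $text );
    if ( 0 === $total_words ) {
        return 0.0;
    }
    return substr_count( $text, $phrase ) / $total_words;
}

$sample = '<p>Digital marketing is measurable. Good digital marketing starts with data.</p>';
echo round( keyword_density( $sample, 'digital marketing' ) * 100, 1 ) . '%'; // 20%
```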

I’m telling you today, it’s confirmed. Google has dropped keywords entirely. That includes keyword density. Matching keywords to search phrases is dead.

Google had been cautioning against “keyword stuffing” for a long time. For SEO, this meant not making our keywords too dense (i.e., not too many repetitions). But now it appears that any keyword density is considered “keyword stuffing”. Without warning, and hidden behind their big update announcements, was a major change to the way Google reads page content.

So what has Google replaced keywords with?

That’s a good question, and it’s something we’re still investigating. The short answer is “content”. But that’s not very helpful, is it? It’s not even entirely accurate. The almost-as-short answer is “quality content”, with the emphasis on quality. Apparently, there have been a number of subsequent minor updates to refine or correct their quality assessment. From what we can tell so far, quality content includes:

  1. Uniqueness. The uniqueness of the content includes having a unique angle on a topic which sets it apart from similar competitor pages. It also means using synonyms instead of repetitions. You can repeat the same idea as much as you like, as long as you do it using different words and present it within a unique context. Generic phrases, especially meaningless or context-dependent ones, will not help and might hinder your content’s uniqueness score.
  2. Readability. Some SEO experts believe that Google is using the Flesch-Kincaid readability tests to measure readability. These tests score the reading difficulty of the writing and assign it a grade level, based on a combination of the average sentence length in words and the average word length in syllables (a rough sketch of the formula follows this list). The ideal writing level, apparently, is about Grade 9, although this might differ for different categories of content. For most people, this just means writing naturally, as one speaks, avoiding technical jargon and big words as much as possible. Spelling and grammar might also be factored into readability, but I haven’t seen this confirmed yet.
  3. Content length. It doesn’t appear there is any minimum or maximum length for content. The ideal length seems to depend on the topic and how much unique content you can contribute to it. A contact page, for example, doesn’t need loads of extra information weakening the point of that page, which is to present contact information or a form. Homepages which have only a slider, menu, and footer, however, might be too thin. I’ve done several tests with those, changing them instead to more of a “one page” scrolling style with several sections describing the main product or service categories. All of my tests have shown an improvement in organic Google Search traffic, as well as engagement with the content on the website (fewer bounces, more pageviews, and higher conversion rates). The idea here is to have just enough content on the page to fully describe what the page is about, and no more (although I haven’t seen any pages penalized yet for too much content). The main problem on most websites is that the content is too short.
  4. Content organization. This aspect hasn’t changed. A well written page will be organized well by topic using title headers. Paragraphs will flow well from an introductory paragraph to a closing paragraph. Asides, advertisements, forms, and anything not significant to the topic should be labeled as such so the Googlebot does not read it as part of the content.
  5. Remove everything keywordy. Especially keywords hidden behind things, in tags, in images, or anywhere not visible to the user. But this also means re-reading your page content to look for anything which might pop out and get flagged as a keyword. This sounds counter-intuitive for those of us who have been doing SEO for a long time, since keywords have been a central aspect of our SEO strategies.
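On the readability point above: the Flesch-Kincaid grade level is a published formula, grade = 0.39 × (words ÷ sentences) + 11.8 × (syllables ÷ words) − 15.59, so it’s easy to approximate yourself. Here is a rough sketch in plain PHP; the syllable counter is a crude vowel-group heuristic, and whether Google uses this exact test is, as noted, speculation.

```php
<?php
// Rough sketch of the Flesch-Kincaid grade level:
// grade = 0.39 * (words / sentences) + 11.8 * (syllables / words) - 15.59
function fk_grade_level( $text ) {
    $text      = strip_tags( $text );
    $sentences = max( 1, preg_match_all( '/[.!?]+/', $text, $m ) );
    $word_list = str_word_count( strtolower( $text ), 1 );
    $words     = max( 1, count( $word_list ) );

    // Very crude syllable estimate: count groups of consecutive vowels per word.
    $syllables = 0;
    foreach ( $word_list as $word ) {
        $syllables += max( 1, preg_match_all( '/[aeiouy]+/', $word, $m2 ) );
    }

    return 0.39 * ( $words / $sentences ) + 11.8 * ( $syllables / $words ) - 15.59;
}

// Aim for a result somewhere around 9 (roughly a Grade 9 reading level).
echo round( fk_grade_level( 'Write naturally. Short words and short sentences keep readers moving.' ), 1 );
```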

I think Google’s general direction is that content should be user driven, not bot driven. Every SEO exploit to date has used an aspect of Google’s assessment which has been disconnected from actual visitor use. The more human Google makes their bot, the harder it will be to exploit and abuse. SEO should be about meeting human usability conditions, not about satisfying or hacking search engine criteria.

So how does Google understand the content without using keywords?

The short answer is that we don’t entirely know. I’ve given some suggestions above for some tested tactics for meeting Google’s content requirements. But that doesn’t tell us anything about the actual engine or algorithm driving Google’s content scoring. For that, I can merely provide an educated speculation.

The key, I believe, is to understand where Google has been spending their research money. One of their huge recent acquisitions has been DeepMind, a machine learning company working in artificial intelligence. In just the last year, DeepMind’s software was able to beat a professional-level Go player, a feat that was thought to be many years away for AI. Google is not the only large company investing in AI; others include Microsoft, IBM, and Facebook. While it’s highly unlikely that DeepMind is being used in Google Search yet, it’s very conceivable that sometime in the near future it will be. What’s presently likely is that the Googlebot is using very complex semantic processing, possibly with some machine learning components, to assist how it understands content.

In a recent post on Google’s Official Blog, Google’s new CEO, Sundar Pichai, writes:

Looking to the future, the next big step will be for the very concept of the “device” to fade away. Over time, the computer itself—whatever its form factor—will be an intelligent assistant helping you through your day. We will move from mobile first to an AI first world. (From this year’s Founders’ Letter.)

What that means for SEO is that it will get increasingly difficult to “beat” the Googlebot with exploits to get to the top of Google Search. More than ever, we need to focus our SEO efforts on providing authentic quality content and genuine usability for relevant human visitors. We measure these in Google Analytics by engagement metrics like bounce rate, pageviews per session, and goal conversions. We test these with A/B Split Testing methods using Google Content Experiments. By improving the performance of our websites for human traffic, we help teach the Googlebot to interact with our websites more like a human. Google ultimately wants to score our webpages as real humans would, and they have access to all the metrics by which to measure their success. And so do we.


Digital Marketing Essential 2016

The many fields of digital marketing.

Over the last 5 years, I’ve written quite a bit about digital marketing, both from the perspective of industry research as well as my professional experience doing digital marketing. “Digital marketing” is a bit of a buzz phrase, and there is a little bit of misunderstanding about what it is, how it works, and how to use it in one’s own strategic business marketing plan. Since I am starting a fresh blog, in this post I will summarize what industry experts mean by “digital marketing”, what it is used for, and which essential tools we use.


What is “digital marketing”?

Digital marketing is the process of planning, implementing, measuring, reporting, and analyzing the results of your marketing efforts using digital technology. Digital marketing methods can be applied to traditional (offline) and new (online) media. Digital marketing is the only method by which you can measure results-based marketing strategies and goals.

digital analytics marketing cycle

As a digital marketer, at the initial planning stage, I try to help you understand your marketing and advertising from your target audience’s perspective as a three-stage process: entry (usually a website landing page), funnel (or bait), and conversion (or hook). From the entry, we can gather acquisition information (where they came from) and engagement information (did they interact?).

What is digital marketing used for?

In theory, the goal of digital marketing is to continually improve the desired results of your marketing goals. In practice, once we have helped you define your goals, usually based on measurable benchmarks called “conversion points”, the goal is to increase the conversion rates at these defined conversion points. But if we think of digital marketing in the 3 stages above, we also have important “micro-conversions” to optimize before your target audience even reaches your conversion points.

New Digital Marketing Workflow

A common goal for your entry points is to maximize engagement metrics based on acquisition sources and entry content. One of the most important engagement metrics for entry points is “bounce rate”, the percentage of sessions in which the audience does not interact beyond the entry page. A high bounce rate means low average interaction, which is bad; a low bounce rate means high average interaction, which is good. We can also compare the bounce rate based on the audience source.
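The arithmetic itself is trivial; Google Analytics counts a session as a “bounce” when it contains only a single interaction, so the bounce rate is simply (a sketch with made-up figures):

```php
<?php
// Bounce rate = single-interaction (bounced) sessions / total sessions.
$sessions = 1200; // example figures
$bounces  = 780;
echo round( $bounces / $sessions * 100, 1 ) . '%'; // 65%
```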

So, for example, we might find that our Facebook ad has a much lower bounce rate than our Google AdWords. At that point, we have a choice: try to optimize Google AdWords to improve its targeting, or move our AdWords budget over to Facebook. On further analysis, supposing we had an A/B test running between two different landing pages for both content and offer, we might discover that our Facebook ads have the lowest bounce rate with one particular offer or piece of content, and our AdWords with another. Based on that information, it’s clear that our Facebook ads should be directed to one entry point, and our AdWords to the other. Used together, we might find we get the maximum interaction rate for that particular multi-channel marketing campaign.

A/B Split Testing with Google Content Experiments

While a website is often the central medium for a digital marketing workflow, since Google released Universal Analytics in late 2014, digital marketing doesn’t require use of a website. Any digital device or application which can be customized with the analytics code will work to transmit the data we need to report on and analyze the marketing. Retail companies can often integrate analytics into their POS checkout systems and in-store search kiosks. Mobile apps, of course, can be loaded with analytics. Print advertising can use analytics variables, often embedded in QR codes or short URLs, to transmit relevant data to the digital entry point. Websites often make sense as the go-to digital entry point because they are relatively easy to set up and track, and the added traffic from a marketing campaign can double as a boost to your SEO rankings as well.
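To illustrate the “any device with the analytics code” point: non-web devices usually report through Google Analytics’ Measurement Protocol, which is nothing more than an HTTP request with a handful of parameters. A rough sketch in plain PHP with cURL (the property ID, client ID, and page path are placeholders):

```php
<?php
// Rough sketch: report a "pageview" from a non-web device (a POS terminal,
// an in-store kiosk, etc.) via the Google Analytics Measurement Protocol.
$hit = array(
    'v'   => 1,                      // protocol version
    'tid' => 'UA-XXXXXXX-1',         // your GA property ID (placeholder)
    'cid' => '555',                  // anonymous client/device ID (placeholder)
    't'   => 'pageview',             // hit type
    'dp'  => '/kiosk/store-search',  // virtual "page" for this device (placeholder)
    'dt'  => 'In-store search kiosk',
);

$ch = curl_init( 'https://www.google-analytics.com/collect' );
curl_setopt( $ch, CURLOPT_POST, true );
curl_setopt( $ch, CURLOPT_POSTFIELDS, http_build_query( $hit ) );
curl_setopt( $ch, CURLOPT_RETURNTRANSFER, true );
curl_exec( $ch );
curl_close( $ch );
```

For print advertising, the equivalent is usually even simpler: a UTM-tagged URL (utm_source, utm_medium, utm_campaign) hidden behind a QR code or short URL, so the campaign data arrives with the visit itself.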

5 Essential Tools for the Lean Digital Marketer

I’ve compiled a short list of the tools which I consider essential to every digital marketer. The marketing tools I’ve listed below are all free, to keep costs “lean”. They are not meant to be self-serve, do-it-yourself, or even easy to use. Most of them have steep and ongoing learning curves, which makes them most suitable for the professional digital marketer. While there are quite a few semi-automated solutions online now, many targeting the small and mid-size business owner or marketing department, I have found they are all very expensive, even to start. They also tend to have poor customer service, little technical support, and almost no transparency about how your monthly fees are being spent, leaving the business owner with little or no say in business-related marketing choices. Using a good professional digital marketer, you should get the highest quality customer service, understandable explanations of the technical aspects, and full transparency regarding the spend of your digital marketing budget. The professional digital marketer will also provide you with the results and their experienced insights, but leave you, the business owner, with the important business decisions. We’re not here to tell you how to run your business, but to give you the best tools and insights to effectively execute your marketing-related business plans.

Google Analytics for the data

Google Analytics is the fundamental tool for the digital marketer. We use Google Analytics to track, collect, and report the relevant marketing data. While there are a number of alternatives to Google Analytics, many of them are far more expensive, less customizable, and less transparent about how their reports are generated. I’m not only a huge fan of Google Analytics, I’m also certified by Google. In addition to providing the means for tracking, collecting, and reporting data, Google Analytics provides a Content Experiments tool for A/B split testing your content variations. It also integrates all your digital marketing into one central place to compare the effectiveness of different campaigns, channels, sources, and content.
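For completeness, “adding the tracking code” to a WordPress site amounts to printing Google’s standard Universal Analytics (analytics.js) snippet into the page head. The snippet below is Google’s own; wrapping it in a wp_head action and the property ID placeholder are mine, and most themes or plugins provide a settings field that does the same thing.

```php
<?php
// Print the standard Universal Analytics (analytics.js) snippet into the <head>.
// Replace UA-XXXXXXX-1 with your own property ID.
function swd_google_analytics() {
    ?>
    <script>
    (function(i,s,o,g,r,a,m){i['GoogleAnalyticsObject']=r;i[r]=i[r]||function(){
    (i[r].q=i[r].q||[]).push(arguments)},i[r].l=1*new Date();a=s.createElement(o),
    m=s.getElementsByTagName(o)[0];a.async=1;a.src=g;m.parentNode.insertBefore(a,m)
    })(window,document,'script','https://www.google-analytics.com/analytics.js','ga');
    ga('create', 'UA-XXXXXXX-1', 'auto');
    ga('send', 'pageview');
    </script>
    <?php
}
add_action( 'wp_head', 'swd_google_analytics' );
```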

Your Website as a central hub

Your website should be used as a central hub for all your digital marketing. That means you should be directing all advertising to tracked entry points on your website. Again, there are a number of other tools available for automating and managing your digital marketing campaigns, but I’m not convinced that any I’ve seen are significantly better than what you can do on your own website. Using your website as a central location to manage all your campaigns seems to me to be the most efficient method in your digital marketing workflow.

Google Webmaster Tools for gauging audience interests

Google Webmaster Tools is primarily used for tracking your website pages’ SEO metrics for Google Search. While I think that SEO should be a high priority within your overall digital marketing strategy, it is just one source (here, Google) and one channel (organic search) of many to consider and prioritize. And while SEO is not essential to digital marketing, the additional data Webmaster Tools provides is extremely valuable for planning digital marketing campaigns, quite apart from SEO. It reports which queries (the actual phrasing used) most often lead people to your pages in Google Search, for example. That can help you when you are designing and phrasing your advertising content, and it can give you an idea of which aspects of your business might gain the most traction if promoted.

Your Blog as a social media hub

Social media is a channel you definitely should not ignore if you are doing multi-channel digital campaigns. Your social media might sometimes be the conversion point for the campaign itself; for example, you might use multiple channels to gather followers for your Facebook page. The main advantage of social media is that it streams your content directly to your followers. Instead of waiting for a piece in the mail or an email, or having to search for you or your business in order to discover your new content, your followers are presented with the content you publish directly in their feeds.

While there are a number of tools to centrally manage your social media content across sources, and many of those are useful if you want to keep up with and respond to comments, your blog can be perfectly suited for auto-publishing your new posts to multiple social media platforms at once. For example, if you have added a new product to your website, you can also write a blog post about it (linking to that page on your website), set it to auto-publish to social, and there it is, summarized on your social media page as well (linking back to the blog post on your website). Plus, it is automatically sent to your social media followers, so they can click through to your blog post, and from there to your new product, which hopefully has a good call to action of its own.
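If you’re curious what those auto-publishing plugins hook into, this is the general shape: WordPress fires a hook when a post is published, and the plugin builds the share message from the post’s title, excerpt, and permalink before handing it off to each network’s API. A heavily simplified sketch follows; send_to_social() is a stand-in for whatever plugin or API client actually does the posting.

```php
<?php
// Heavily simplified sketch of a blog-to-social auto-publish workflow.
// send_to_social() is a placeholder, not a real function.
function swd_auto_publish( $post_id, $post ) {
    $share = array(
        'message' => get_the_title( $post_id ),
        'summary' => wp_strip_all_tags( get_the_excerpt( $post ) ),
        'link'    => get_permalink( $post_id ),  // becomes the "read more" backlink
    );
    // send_to_social( 'facebook', $share );
    // send_to_social( 'twitter', $share );
}
add_action( 'publish_post', 'swd_auto_publish', 10, 2 );
```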

This is a longer and more complicated funneling process than single campaigns because it involves getting social media followers first. But when executed right, socially streamed content can be extremely effective. And, of course, using Google Analytics we can track the entire process, continually revising it to become more effective. Plus, there are huge SEO benefits to this sort of interactive social process with your website. Today, Google favours these sorts of social interactions with your website much more than traditional “backlinks”. Whenever you can, you should try to double your digital marketing goals with SEO benefits. Auto-posting your blogs to social media is one excellent method.

Your Digital Marketer for their valuable summaries and insights

A certified Google Analytics individual such as myself can help you define and measure your digital marketing goals, provide tailored reports to summarize the data, and do deeper analysis to explain the data and suggest opportunities for improving your marketing efforts. While all the tools I’ve listed are free to use, the expertise of a qualified digital marketer can really help get the most out of these tools and optimize your marketing results. A professional digital marketer works with these tools every day to help businesses like yours maximize their marketing impact and ROI.

Further Information about Digital Marketing

Over the next little while, I will be consolidating and summarizing my work and previous posts regarding digital marketing. I will also explore specific topics related to digital marketing more in-depth, including industry updates as they occur. To read more, see my recent 3 part series on how to maximize the use of your website as a digital marketing tool:

http://allegrasurrey.com/blog/entry/unlocking-the-marketing-potential-of-your-website-br-part-1-landing-pages-funnels-conversions

Please contact me for more information about how I can help you get started or improve your digital marketing.


Welcome to Strategic Website 2016

Strategic Web Development and Digital Marketing

It has been about two years since I did a major website revision, which is about the maximum lifespan for a website. Instead of just reworking or redesigning my existing website, I decided to make a big move from Joomla to WordPress. I also totally revised the content to be more of a gallery-style portfolio showcase of my web services and work. Written articles will be reserved for blog posts, which is one of the reasons I decided to move to WordPress. Over the next few weeks, I will be consolidating and reposting many of my previous articles, as long as they are still relevant.

Why move to WordPress?

Well, I had a number of reasons. Here are a few of the reasons I switched my website’s CMS from Joomla to WordPress:

1. Blogging.

There is nothing wrong with Joomla for blogging, especially for someone who is familiar with Joomla. You can do pretty much the same things as WordPress for blogging in Joomla using Joomla’s blog menu types. If you add the EasyBlog component to the Joomla CMS, you get even more blogging options bundled together than with the default WordPress blogging tools. So why switch?

The fact is, WordPress was built as a blogging tool before becoming a full-fledged CMS. It was branded and marketed as a blogging tool, and it has created the largest community of bloggers within the WordPress.org blogging network. By becoming a regular WordPress user for my own website’s blog, I (a) get involved and exposed within the WordPress blogging community and (b), although I might not be an immediate fan, familiarize myself with the regular usage of WordPress’ blogging tools, putting me in an excellent position to help others.

Coming from Joomla, where I was perfectly happy with the toolset I had used, I get a unique perspective on both the advantages and shortcomings of blogging in WordPress. Over time, if I find that there are some tools I had which I just can’t blog without, I can develop them as plugins for WordPress.

2. Content based websites should generally use WordPress for their CMS.

I’ve written about this before. I divide websites into three categories: content, service, and product. When I recommend a CMS, I first try to determine whether the website is going to be primarily content, service, or product based. From there, my typical starting point for a proposal will be WordPress for content-based websites, Joomla for service-based websites, or Magento for product-based websites. There are, of course, other considerations, such as the availability of ready-made add-ons to accomplish all the desired functions for the website, but generally my categorical analysis works well.

So when it came to my own website, a website about web development, website design, and digital marketing… a content-based website by all standards!… I had always gone with Joomla! This should be embarrassing, since it is contrary to what I recommend to my clients. The real reason is simply that I like Joomla better. I find it easier to customize, design, and develop within. Since it’s my own website, I should be allowed to choose the CMS I like best, right?

Well… yes and no. While I am actually very familiar with WordPress, from its core frameworks and API to its administration interface and settings, my familiarity is entirely from developing websites in WordPress for clients. I have no familiarity with the benefits and headaches involved in the regular use of WordPress for my own website. That creates a little bit of a gap between me and my clients, most of whom are familiar with and prefer WordPress. So I decided I needed to move beyond my own preferences and put the client’s user experience first.

3. Annoyances can inspire innovation.

As a professional web developer, I do much more than just create and customize websites. I also write plugins for frameworks and content management systems. I enjoy coding a new plugin from scratch with only a problem, annoyance, or desired feature in mind. It reminds me of the good old days when everything on the web was programmed from scratch.

There is no denying that WordPress has become the most popular content management system, if not the most popular platform in general for creating websites. With my background, skills, and attitude, I expect that when I run across something I can’t do in WordPress, or something in WordPress which really annoys me, it will inspire me to innovate a wicked solution. And when I have a solution, I can release it to the largest community of developers, designers, and content creators on the web.
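For anyone who hasn’t looked inside one, the barrier to entry for a WordPress plugin is genuinely low: a single PHP file with a header comment and a hook or two. A minimal sketch (the plugin name, function name, and the particular filter tweak are just placeholders for whatever annoyance you happen to be fixing):

```php
<?php
/**
 * Plugin Name: Annoyance Fixer
 * Description: Placeholder plugin skeleton that fixes one small annoyance.
 * Version:     0.1
 * Author:      Your Name
 */

// Example tweak: shorten automatic excerpts from the default 55 words to 30.
function annoyance_fixer_excerpt_length( $length ) {
    return 30;
}
add_filter( 'excerpt_length', 'annoyance_fixer_excerpt_length' );
```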

4. Blog content.

While I do have a ton of content categories to blog about already, I thought that transitioning to WordPress from the perspective of a “new user”, but with the experience of a web developer, would make for an interesting blog content category. So I plan to keep you posted as I learn WordPress again as if for the very first time. I will post about my headaches, as well as tips and tricks as I (re)learn them from scratch. This approach should produce some of the most useful tutorials and walkthroughs available for WordPress.

So that’s it for now! If you have any of your own good or bad experiences with WordPress, why not share them below? Also let me know if there are any topics of particular interest to you. They might just find their way into my next post!
