With three presidential debates in the can, the loser is…

Earth was hardly mentioned, and that means the loser is actually . . . us

The pundits will no doubt be yammering on for days about who won and who lost the final U.S. presidential debate.

But I’d say that we don’t need the pundits to tell us who the overall loser was. I think it was humanity.

During the debates, we heard a lot of shouting, but almost nothing about the most profound, long-term issue all of us confront: How 7 billion of us, probably growing to 9 billion in relatively sh


2 A/B Testing Failures and 1 Win: Why A/B Testing Failures Are as Important as the Wins

You’ve probably seen a lot of blog posts floating around the internet about A/B testing successes. This blog is no exception.

But hardly anyone talks about their failed experiments. I don’t blame them–it’s hard to expose to the world that you weren’t right about something.

Which might lead you to believe no one is running failed, insignificant tests. I mean, if no one’s talking about it, it must not be happening, right?

Let me tell you a secret: Everyone is failing. Everyone is running or has run an experiment that got them nowhere.

However, at Kissmetrics, failure is part of our A/B testing process. If none of our tests fail, we know we’re not running enough tests or our ideas are too safe.

In fact, the bigger the failures, the closer we are to an even bigger win.

We’re never 100% correct about our hypotheses. No matter how many years of experience you have, no matter how much you think you understand your customers…there’s always room for learning.

Otherwise, why would we test in the first place?

Now let’s take a look at a couple of our own failures so you can see what I mean.

Failure #1: Too much white space on our product page

Test Hypothesis: There’s too much clutter at the top of the page. By removing the background image and reducing white space, we’ll make the page copy more visible, enticing people to scroll down and interact more with the page.

You already know that this test failed, but just from looking at the hypothesis–do you know why?

I’ll give you a hint: it has a lot to do with data.

We technically had data that indicated a dip in conversion on this page.



However, we didn’t have evidence that people weren’t scrolling down, or that the space at the top was stopping them from converting. When a hypothesis has little or no evidence, we have a slim chance of winning the test.


Improvement over original: 4.41%
Certainty: 55.27% for the variant

What we learned:

No significant data here.

Having a hero image on this page (or not) doesn’t influence our conversions. In previous homepage tests, our hero image mattered, but apparently not on this exact page.

In previous tests on this particular page, we experimented with the copy and overall messaging. Therefore, our next test should be around the copy to see if we’ll get a lift.

Failure #2: Copy and images on the product page

Test Hypothesis: A more benefits-oriented product landing page will lead to better-quality leads.




Improvement over original: -13.68%
Certainty: 80.12%

Improvement over original: 8.77%
Certainty: 60.78%

What we learned:

In the last test (failure #1), we didn’t change enough of the page for there to be significant results.

This time, we changed 1) the copy, 2) product screenshots, and 3) the overall layout of the page below the hero image.

That’s a lot, right?

Here’s the thing: when we change both copy and design, it’s hard to tell whether the copy or the design was responsible for a lift or a decrease. We can’t isolate the variable that drove the outcome.

A better approach is to test the copy first, then test the design after. That’s what we’ll do for our next test.

The Win: Product page headline

Test Hypothesis: Adding a benefit-centric headline to the product page will increase demo requests and signups because we’re showing visitors the value they’ll get from Kissmetrics. All our happy customers have said in interviews that they love seeing individual user behavior in Kissmetrics. But we’ll take that one step further and add the result to the end of the headline.




Signups:
Improvement over original: 163.46%
Certainty: 97.33%

Requested Demo:
Improvement over original: 507.97%
Certainty: 99.67%

What we learned:

Finally, a win! And it only took us 2 failed tests to get there. Not bad.

The increase in requested demos is huge. We didn’t see 99% significance on the signups, so we can’t say for sure that’s a win–but a 507.97% lift in demos is worth launching.
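Those certainty figures come from a standard significance calculation. As a rough illustration (with made-up visitor counts, not Kissmetrics’ real data), the two-proportion comparison most testing tools run under the hood can be sketched in Python:

```python
from math import erf, sqrt

def certainty(conv_a, n_a, conv_b, n_b):
    """Approximate 'certainty' that variant B beats control A, using a
    normal approximation to the difference in conversion rates."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    se = sqrt(p_a * (1 - p_a) / n_a + p_b * (1 - p_b) / n_b)
    z = (p_b - p_a) / se
    # Phi(z): probability that B's true rate exceeds A's
    return 0.5 * (1 + erf(z / sqrt(2)))

# Hypothetical counts: 20/1000 demos on control vs 38/1000 on the variant
print(round(certainty(20, 1000, 38, 1000), 4))
```

The exact math varies by tool, but the takeaway is the same: a big lift on a small number of conversions can still fall short of the certainty bar you set.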

The major learning here is that the headline on our product page carries a lot of weight. We didn’t change the rest of the page, or the Call to Action copy. A good next test would be to test the rest of the landing page copy to see how much weight it carries.

And finally, having user interviews and user reviews made our hypothesis strong. Yes, benefit-centric copy is a good thing, but what benefit? What do our ideal customers absolutely love about our product?

Having this research evidence from our customers made the win even bigger.


All failures lead to a big win when you’re extracting a major learning from each test. The best part about these learnings is that they’re unique to YOU and your company–meaning, one test that worked for us might not work for you.

However, the best way to get these learnings and big wins from your A/B testing is to have a top-notch system in place.

At Kissmetrics, our A/B testing program isn’t only about optimizing our conversions. We’re optimizing the system of testing…that eventually leads to more conversions.

If you want to learn how to get major A/B test wins, join our free email course to learn:

  • The proven system of testing we’ve used to triple our conversion rate (most companies spend years trying to figure this out)
  • A foolproof checklist for ensuring each test can lead to a big win (Yes, this can work for you if you apply it correctly. We’ll show you how)
  • What we learned from over 70 tests we’ve run on our website–including the #1 mistake even the most seasoned professionals make that could negatively impact your funnel

About the Author: Allison Carpio is the Product Marketing Manager for Kissmetrics.


CLIMBER B.C. Spring Summer 2017 | CPM Moscow by Fashion Channel


The Funniest Wildlife Photos of 2016

The middle of October during a presidential election year is a really good time to remember not to take the world so seriously.

Case in point: The organizers of the Comedy Wildlife Photography Awards announced the finalists for the funniest animal photograph of 2016. The competition, in only its second year, is the antithesis of your traditional, staid wildlife photography contests. It’s a light-hearted competition that showcases the inner comedians in the animal kingdom, and it doesn’t t


How to Run an A/B Test in Google Analytics

Designs don’t always work out as intended.

The layout looks good. The color choices seem great. And the CTA balances clever and clear.


It’s not working. All of it. Some of it. You’re not completely sure, but something’s gotta give.

Despite everyone’s best intentions, including all the hours of research and analyses, things don’t always work out as planned.

That’s where continuous testing comes in. Not a one-and-done or hail & pray attempt.

Even better, your testing efforts don’t need to be complex or time-consuming.

Here’s how to set up a split test inside Google Analytics in just a few minutes.

What are Google Analytics Content Experiments?

Let’s say your eCommerce shop sells Pug Greeting Cards. (That’s a thing by the way.)

Obviously, these should sell themselves.

But let’s just suspend disbelief for a moment and hypothesize that sales are low because you’re having trouble getting people into these individual product pages in the first place.

Your homepage isn’t a destination; it’s a jumping off point.

Peeps come in, look around, and click somewhere else.

Many times that’s your Product/Service pages. Often it’s your About page.

Regardless, the goal is to get them down into a funnel or path as quickly as possible, (a) helping them find what they were looking for while also (b) getting them closer to triggering one of your conversion events.

The magic happens on a landing page, where these two things – a visitor’s interest and your marketing objective – intertwine and become one in a beautiful symphony.

So let’s test a few homepage variations to see which do the best job at directing new visitors into your best-selling products.

One has a video, the other doesn’t. One is short and sweet, the other long and detailed. One has a GIF, the other doesn’t.


New incoming traffic gets split across these page variations, allowing you to watch and compare the number of people completing your desired action until you can confidently declare a winner.

(It’s probably going to be the one featuring this video.)
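Under the hood, a split like this is usually just deterministic bucketing, so a returning visitor always sees the same variation. A minimal sketch (the variant names are hypothetical, and this is a generic approach rather than Google’s actual implementation):

```python
import hashlib

VARIANTS = ["original", "variant-a", "variant-b"]  # hypothetical page versions

def assign_variant(visitor_id: str) -> str:
    """Deterministically bucket a visitor so they always see the same page."""
    digest = hashlib.sha256(visitor_id.encode()).hexdigest()
    return VARIANTS[int(digest, 16) % len(VARIANTS)]

# Same id in -> same variant out, on every visit
print(assign_variant("visitor-123") == assign_variant("visitor-123"))
```

Hashing the visitor id (rather than picking at random on every pageview) is what keeps the experience consistent while still splitting traffic roughly evenly.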

Running a simple, straightforward split test like this is landing page optimization 101: you identify the specific page variables that produce the best results for your audience and multiply them across your site.

Google Analytics comes with a basic content experiments feature that will allow you to compare different page variations, split traffic to them accordingly, and get email updates about how results are trending and whether you’re going to hit your defined objective.

But… they’re technically not a straightforward A/B test. Here’s why, and how that’s actually a good thing.

Why Content Experiments Can Be Better than Traditional A/B Tests

Your typical A/B test selects a very specific page element, like the headline, and changes only that one tiny variable in new page variations.

The interwebs are full of articles where switching up button color resulted in a 37,596% CTR increase* because people like green buttons instead of blue ones. Duh.

(*That’s a made up number.)

There are a few problems with your classic A/B test, though.

First up, tiny changes often regress back to the mean. So while you might see a few small fluctuations when you first begin running a test, small changes usually only equal small results.

(Image Source)

The second problem is that most A/B tests fail.

And if that weren’t bad enough, the third issue is that you’re going to need a TON of volume (specifically, 1,000 monthly conversions to start with and a test of at least 250 conversions) to determine whether or not those changes actually worked or not.
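Those volume thresholds are rules of thumb, but you can estimate the required sample size yourself. A sketch using the textbook two-proportion formula (95% confidence, 80% power; the baseline rate and lift are made-up inputs):

```python
from math import ceil, sqrt

def visitors_per_variant(base_rate, min_lift, z_alpha=1.96, z_power=0.84):
    """Rough per-variant sample size needed to detect a relative lift
    (two-sided alpha = 0.05, 80% power; standard approximation)."""
    p1 = base_rate
    p2 = base_rate * (1 + min_lift)
    p_bar = (p1 + p2) / 2
    num = (z_alpha * sqrt(2 * p_bar * (1 - p_bar))
           + z_power * sqrt(p1 * (1 - p1) + p2 * (1 - p2))) ** 2
    return ceil(num / (p2 - p1) ** 2)

# e.g. a 3% baseline conversion rate and a 20% relative lift to detect
print(visitors_per_variant(0.03, 0.20))
```

Notice how the number explodes as the lift you want to detect shrinks; that’s exactly why small-change A/B tests demand so much traffic.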

Google Analytics Content Experiments use an A/B/N model instead. Which is like a step in between one-variable-only A/B tests and coordinated-multiple-variable multivariate tests.

(After typing that last sentence, I realized only hardcore CRO geeks are going to care about this distinction. However, it’s still important to understand at a high level so you know what types of changes to make, try, or test.)

You can create up to 10 different versions of a page, each with their own unique content or changes.

In other words, you can test bigger-picture stuff, like: “Does a positive or negative Pug value proposition result in more clicks?”

Generally these holistic changes can be more instructive, helping you figure out what messaging or page elements you can (and should) carry through to your other marketing materials like emails, social and more.

And the best part: instead of requiring a sophisticated (read: time-consuming) setup process to make sure all of your variable changes are statistically significant, you can use Google Analytics Content Experiments to run faster, iterative changes and learn on the go.

Here’s how to get started.

How to Set Up Google Analytics Experiments

Setting up Content Experiments only takes a few seconds.

You will, however, have to set up at least one or two page variations before logging in. That topic’s beyond the scope here, so check out this and this to determine what you should be testing in the first place.

When you’ve got a few set up and ready to go, log in to Google Analytics and start here.

Step #1. Getting Started

Buried deep in the Behavior section of Google Analytics – you know, the one you ignore when toggling between Acquisition and Conversions – is the vague yet innocuous-sounding ‘Experiments’ label.

Chances are, you’ll see a blank screen when you click on it that looks something like this:


To create your first experiment, click the button that says Create Experiment on the top left of your window.

With me so far? Good.

Let’s see what creating one looks like.

Step #2. Choose an Experiment

Ok now the fun starts.

Name your experiment, whatever.

Then look down at the Objective selection. Here’s where you set an identifiable outcome to track results against and determine a #winning variation.


You have three options here. You can:

  • Select an existing Goal (like opt-ins, purchases, etc.)
  • Select a Site Usage metric (like bounce rate)
  • Create a new objective or Goal (if you don’t have one set-up already, but want to run a conversion-based experiment)

The selection depends completely on why you’re running this test in the first place.

For example: most are surprised to find that their old blog posts often bring in the most traffic. The problem? Many times those old, outdated pages also have the highest bounce rates.

Navigate to: Behavior > Secondary Dimensions + Google/Organic > Top Pageviews > Bounce Rate.

Here’s an example:


(Here are a few other actionable Google Analytics reports to spot similarly low hanging fruit when you’re done setting up an experiment.)

Let’s select Bounce Rate as the Objective for now, so we can test changes to the page layout, or increase the volume of high-quality visuals, to get people to stick around longer.

After selecting your Objective, you can click on Advanced Options to pull up more granular settings for this test.


By default, these advanced options are off, and Google will “adjust traffic dynamically based on variation performance”.

If you enable them, however, your experiment will split traffic evenly across all the page variations you add, run for two weeks, and shoot for a 95% statistical confidence level.

Those are all good places to start in most cases, however you might want to change the duration depending on how much traffic you get (i.e. you can get away with shorter tests if this page will see a ton of traffic, or you might need to extend it longer than two weeks if there’s only a slow trickle).
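That “adjust traffic dynamically” default is a multi-armed bandit approach. As an illustration only (hypothetical numbers, and not Google’s exact algorithm), Thompson sampling captures the idea: better-performing variations gradually earn more traffic:

```python
import random

# Hypothetical results so far: (conversions, trials) per variation
results = {"original": (30, 1000), "variant-a": (45, 1000), "variant-b": (28, 1000)}

def pick_variation():
    """One Thompson-sampling draw: sample a plausible conversion rate for
    each variation from its Beta posterior and serve the highest draw."""
    draws = {name: random.betavariate(1 + conv, 1 + n - conv)
             for name, (conv, n) in results.items()}
    return max(draws, key=draws.get)

# Over many visitors, the stronger variation ends up served far more often
served = [pick_variation() for _ in range(10_000)]
print(served.count("variant-a") > served.count("variant-b"))
```

The practical upshot: the dynamic default wastes less traffic on obvious losers, which is why it’s a sensible starting point unless you specifically need an even split.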

So far so good!

Step #3. Configure Your Experiment

The next step is to simply add the URLs for all of the page variations you want to test.

Literally, just copy and paste:


You can also give them helpful names to remember. Or not. It will simply number the variants for you.

Step #4. Adding Script Code to your Page

Now everyone’s favorite part – editing your page’s code!

The good news is that the first thing you see under this section is a helpful toggle button to just email all this crap code over to your favorite technical person.

If you’d like to get your hands dirty however, read on.


First up, double check all of the pages you plan on testing to make sure that your default Google Analytics tracking code is installed. If you’re using a CMS, it should be, as it’s usually added site-wide initially.

Next, highlight and copy the code provided.

You’re going to need to look for the opening head tag in the Original variation (it should be located near the very top of your HTML document). Search for <head> to make it easy:


Once that’s done, click Next Step back in Google Analytics to have them verify if everything looks A-OK.

Not sure if you did it right? Don’t worry – they’ll tell you.

For example, the first time I tried installing the code for this demo I accidentally placed it underneath the regular Google Analytics tracking code (which they so helpfully and clearly pointed out).
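If you’d rather not eyeball it, a quick script can catch that exact mistake. A sketch (the page source and marker comments are hypothetical) that checks the experiment snippet sits after the opening head tag and before the regular tracking code:

```python
# Hypothetical page source; the experiment snippet must come before the
# regular GA tracking code, just inside the opening <head> tag.
page_html = """
<html><head>
<!-- experiment code -->
<script>/* Content Experiments snippet */</script>
<!-- tracking code -->
<script>/* ga('create', ...) tracking snippet */</script>
</head><body></body></html>
"""

def snippet_order_ok(html: str) -> bool:
    """Sanity-check: experiment code appears after <head> and before tracking."""
    head = html.find("<head>")
    experiment = html.find("experiment code")
    tracking = html.find("tracking code")
    return -1 < head < experiment < tracking

print(snippet_order_ok(page_html))  # -> True
```

A check like this is crude (it just compares string positions), but it catches the swapped-order mistake described above before Google’s validator has to.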


After double checking your work and fixing, you should see this:


And now you’re ready to go!

See, that wasn’t so bad now was it?!


Websites are never truly done and finished.

They need iteration: constant analysis, new ideas, and changes that continually increase results.

Many times, that means analyzing and testing entire pages based on BIG (not small) changes like value propositions or layouts. These are the things that will deliver similarly big results.

Landing page optimization and split testing techniques can get extremely complicated and require special tools that only CRO professionals can navigate.

However, Google Analytics includes its own simple split testing option in Content Experiments.

Assuming you already have the new page variations created and you’re comfortable editing your site’s code, they literally only take a few seconds to get up-and-running.

And they can enable anyone in your organization to go from research to action by the end of the day.

About the Author: Brad Smith is a founding partner at Codeless Interactive, a digital agency specializing in creating personalized customer experiences. Brad’s blog also features more marketing thoughts, opinions and the occasional insight.


YVES SAINT LAURENT SS 2000 Paris Fashion Week 4 of 5 pret a porter woman by FashionChannel


Skin Cell to Egg Cell to Mouse Pup: Researchers Hail Fertility Breakthrough

Japanese researchers announced Monday that they used stem cells to create viable mouse egg cells entirely in the lab.

Scientists from the University of Kyoto coaxed skin cells into egg cells, which they then fertilized and implanted into female mice to successfully breed a new generation of mice. The technique had a low success rate, but it’s the first time lab-grown egg cells have been used to produce healthy offspring.

Previously, members of the same team had created mature egg cells


When Good Customers Leave: Troubleshooting Your Customer Retention Approach

Every business needs a steady stream of new leads coming in the door. New business is crucial for survival. If you’re not feeding your funnel, then you’ll limit your capacity for growth and eventually tank.

This is why a lot of businesses heavily focus on new customer acquisition. They build out sales teams with aggressive goals to generate new business and keep the growth train rolling. According to a survey from Statista, the number one challenge facing small business owners in the U.S. was attracting new customers.

Many of those businesses fail to realize that customer retention usually has more profound impacts than efforts to acquire new customers, and neglecting customer retention can lead to the untimely demise of a business.

You have to find unique ways to keep the customers you have because they are your greatest source of revenue.

The Value of Customer Retention

On average, your loyal customers are worth a great deal more than a new customer. If you keep them happy, you can expect a much higher lifetime value for your average customer. In fact, a returning customer can be worth up to 10x more than a first-time customer.

That’s because the average repeat customer could spend up to 67% more around their third year with a business compared to the first six months of their initial relationship. Over time, you develop a strong rapport with customers that come back, and satisfaction and happiness will continue to rise if you consistently nourish your connection with them.

Even if you only make small changes intended to delight your customers, you might see a massive shift in revenue as a result of increased loyalty toward your brand.

Just look at these example scenarios created by Punchline:

Slashing customer churn can significantly impact your bottom line. (Image Source)

The image here illustrates how revenue can increase when a company takes the time to focus on customer retention. In both scenarios, the acquisition strategy remains unchanged. By slashing customer churn in half however, the company is able to almost double its total revenue.
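You can reproduce the gist of that scenario with a few lines of arithmetic. A toy model (made-up numbers, not Punchline’s actual figures) where acquisition stays constant and only churn changes:

```python
def five_year_revenue(new_customers_per_year, annual_churn, revenue_per_customer):
    """Toy model: constant acquisition each year, revenue proportional to
    the surviving customer base."""
    base, total = 0, 0
    for _ in range(5):
        base = base * (1 - annual_churn) + new_customers_per_year
        total += base * revenue_per_customer
    return total

high_churn = five_year_revenue(100, 0.50, 1000)  # 50% yearly churn
low_churn = five_year_revenue(100, 0.25, 1000)   # churn cut in half
print(round(low_churn / high_churn, 2))  # halving churn lifts total revenue substantially
```

The exact multiple depends on the assumptions, but the direction is always the same: identical acquisition, meaningfully more revenue, purely from keeping customers longer.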

The Most Common Causes of Churn and How to Fix Them

In order to retain customers, you first have to understand what’s driving them away. Simply put, some causes are beyond your control. However, there are other causes that you can task your team with correcting.

1. Poor Customer Service

Poor service is arguably the top reason why customers bail on businesses and never return. As many as 78% of customers have skipped out on a purchase due to a poor service experience, and 90% of customers have reportedly abandoned a business altogether due to bad customer service.

That’s a significant chunk of business walking out the door, and with just 4% of dissatisfied customers actually reporting problems, you’ll be hard-pressed to find out where you dropped the ball.

Some service issues might relate to your UX or problems in the funnel/cart.

Failures in service can happen at any point with a customer, from the initial purchase phase to the follow-up phase. Customer service screw-ups can also happen on any channel where engagement takes place: social media, email, over the phone, live events, in-store interactions, etc.

How to Fix It:

2. Ineffective Onboarding

A simple product manufacturer isn’t likely to experience onboarding issues. Very little education is required for someone to understand what to do with a statue they purchased from ThinkGeek, or artisan clothing from a supplier.

More complex systems, like subscription-based software or services, are a different story. The onboarding period, where you guide the customer from initial entry through to their first success, is critical for retaining customers. Considering the amount of work you put in to close the customer, it would be painful to lose them shortly thereafter.

Trello, a project management tool, built its onboarding guide within the product.

If the onboarding process is confusing or nonexistent, then the customer might get frustrated and lost, or fail to understand the full value of what you have to offer. This is the easiest point for a customer to decide that they don’t want to continue with the investment and bail on you.

How to Fix It:

  • Start with great customer service: welcome the customer via email and provide them with a dedicated rep to guide their journey
  • Use onboarding tools to develop tutorials, guides, and usage cues
  • Create an onboarding autoresponder that encourages interaction. Discover their goals and respond with resources/info to help them reach those goals
  • Identify the success triggers of your best customers; use those as benchmarks for guiding new users to success
  • Use customer feedback to uncover common friction points; adapt products, services, and information to eliminate those friction points or help customers overcome them
  • See roadblocks that occur during onboarding with the Funnel Report
  • Crowdsource usability testing to uncover UX issues with services like UserTesting
  • Get feedback while customers are using your app or service with Wootric
  • Continually provide value during the onboarding phase to help customers find that initial success, and remember to maintain engagement beyond that sticking point

3. Success Falls Off

That first success point is great if you can get the customer there through onboarding. However, your interactions and responsibilities don’t end there. You have the entire customer lifecycle moving forward, and you need to regularly check in with customers to ensure they understand how to use and benefit from your product.

If communication drops off and there’s no development in the relationship, the customer is left wondering whether the value is still there over time. If they don’t understand how to leverage your product and feel the value is diminishing, then you’re risking a big reduction in the overall lifecycle of the customer by leaving them to their own devices.

This comparison of GrooveHQ users shows an average user compared to a user on the verge of abandoning the service.

How to Fix It:

  • Develop red flag metrics that can identify customers who are struggling or at risk of leaving
  • Monitor customer metrics: reach out and re-engage those who either haven’t ordered yet or haven’t activated and accessed their accounts
  • Talk to customers and ask them, directly or through surveys, why they’re struggling
  • Segment at-risk customers and create engagement strategies specific to their struggles
  • Don’t focus your marketing solely on acquisition; create high-value content that targets existing users and help them attain more success
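The red-flag ideas above can start as something very simple. A sketch (customer names, dates, and thresholds are all hypothetical) that flags accounts gone quiet or with sharply reduced activity:

```python
from datetime import date

# Hypothetical usage records: (customer, last_login, logins_in_last_30_days)
customers = [
    ("acme",    date(2016, 10, 20), 22),
    ("globex",  date(2016, 9, 1),   1),
    ("initech", date(2016, 10, 18), 3),
]

def at_risk(last_login, logins_30d, today=date(2016, 10, 24)):
    """Red-flag rule of thumb: long silence, or sharply reduced activity."""
    return (today - last_login).days > 21 or logins_30d < 2

flagged = [name for name, last, logins in customers if at_risk(last, logins)]
print(flagged)  # -> ['globex']
```

Two crude thresholds are enough to build a segment worth re-engaging; you can refine the rules as you learn which metrics actually precede cancellations for your product.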

4. Natural Events

Very few customers stick around for the full life of your business. Eventually, almost every customer will churn and there are a number of natural occurrences that lead to this. These instances often have very little to do with anything you or your team failed to do.

Natural churn can happen due to:

  • Personnel changes leading to the onboarding of different products or services
  • Changes in the company’s processes, which require different solutions
  • Customers relocating or going out of business
  • Companies outgrowing a product or service
  • Budgetary changes

Although you can’t directly prevent natural occurrences from happening, you can use this information to improve retention and perhaps save some customers.

Mention reduced churn by emailing pro tips that showcased their features and promoted the value of their service.

How to Fix It:

  • Watch for red flag metrics that show steady declines that could lead to a service cancellation or lost customer
  • Use customer exit surveys to solicit feedback
  • Schedule regular outreach to monitor the changing needs of your current customers
  • Send value-based emails encouraging use and activity

5. Pushing Too Hard

Your customers are going to purchase something from you for a very specific reason: they have a problem and they’re looking for a solution. Once they have a solution, they won’t need another one until they’ve identified another problem.

That means you can’t just motivate them with aggressive sales tactics or harass them with transaction-based emails to get them to purchase more.

When the majority of your interactions with a customer involve attempts at upselling with little extra value, then you can expect to lose some customers. It won’t take them long to find a competitor with a similar solution who doesn’t persistently hound them for money.

How to Fix It:

  • Know your customers and their purchasing habits. This way, transactional emails and promotions will fit their behaviors better
  • Make sure promotional outreach is no more than 20% of interactions; fill the other 80% with value
  • Track the open and click-through rates of emails; if those numbers are falling on transaction emails, then that indicates you’re pushing too hard and losing engagement as a result
  • Coach sales and account managers on providing value before making sales; build relationships to reduce churn and increase the lifetime value of customers


Every company experiences churn. Reducing it is not rocket science and it doesn’t require a complete overhaul of your processes or operations. The key is to first identify the source of the churn. Segment your at-risk customers and develop strategies for fixing the causes. In many cases, you’ll make small, incremental adjustments that are easy to deploy and will go a long way toward delighting customers and reducing churn.

What are some of your approaches to customer retention? How do you keep your current customers coming back for more? Share your ideas with me in the comments.

About the Author: Aaron Agius is an experienced search, content and social marketer. He has worked with some of the world’s largest and most recognized brands to build their online presence. See more from Aaron at Louder Online, their Blog, Facebook, Twitter, Google+ and LinkedIn.


LOLA CASADEMUNT 080 Barcelona Fashion Week Spring Summer 2017 by Fashion Channel


How to make a vortex of semen.

Ever want to make a whirlpool-like vortex out of your semen? Well, according to this published study, it’s not too difficult:

Step 1: Concentrate the sperm in your semen.

Step 2: Put it into an annular-shaped container (a ring formed by two concentric circles).

Step 3: Stand back to admire the beauty that is your very own semen vortex.

Apparently, concentrating the sperm induces them to align and swim in the same direction around the ring, creating a vortex… a vortex of semen!
