Friday 29 November 2013

Kyle Rush Reveals How the Obama Campaign Broke Every Online Fundraising Record: Free #MozCon Video

Posted by EricaMcGillivray



Every year at MozCon, I have the joy of working with our fabulous MozCon speakers. One of the speakers we were most excited about for MozCon 2013 was Kyle Rush. Kyle's name might not be on the tip of your tongue, but he worked on possibly the biggest and best online marketing campaign, Obama for America, as their deputy director of front-end web development. From there, he went to The New Yorker, and he just announced that he's headed over to Optimizely.


When Kyle told us he wanted to present on the conversion rate optimization and A/B testing the Obama campaign did, there may have been some squeeing from Rand (like the Packers won) and me (like over new Sherlock episodes). Marketing nerds. Because regardless of your politics, Obama's reelection campaign not only broke fundraising records but also changed the way we think about using big data and CRO.


Kyle rocked that MozCon 2013 stage. He presented a ton of actionable information for attendees, and his was one of our top-scoring presentations. When we went to decide which full-length MozCon presentation to share with all of you, for free, Kyle's was it. Enjoy!






MozCon 2013 free video - Kyle Rush - Win Through Optimization and Testing




Video Transcription




Kyle: Thank you, Cyrus. It feels great to be in Seattle. I just came from New
York City. Is anybody else here from New York? Yeah. You guys all know what I
mean when I say it feels great to be in Seattle. You guys know how to do the
summer with this 77 degree weather. This dry heat is awesome. We've got to
figure out how to get that in New York City. Can we get on that?


As Cyrus said, my name is Kyle Rush. I'm currently at 'The New Yorker.' Before
that I was at the Obama campaign. I worked on a lot of the product and tech
aspects of our online fundraising. Obviously, we ran a lot of optimization on
that. So, that's what I'm going to be talking to you guys today about.


Before we get started, I want to give you guys some context on what we jumped
into, the situation on day one at the Obama campaign. All the media outlets at
the time were reporting that we were expected to raise one billion. They did
probably $700 million in 2008. So, we were expected to raise one billion.


Just to put that into perspective for you guys, Amazon's Q4 profit for last
year was only $97 million. So, when you spread that out over a year and a half,
which was the life of the campaign, you still only get like half or a little
over half what we were expected to raise on the campaign. So, this was a pretty
daunting challenge.


But, in the end... Oh, I didn't mean to click that. But, in the end we did $1.1 billion. So, we exceeded expectations. None of us thought we could do it. Obviously, that's a lot of money. We did $690 million of it online, as Cyrus said.


Another thing that I want to talk to you guys about is just an example of one
of our online fundraising programs. That was called Quick Donate. This was a way
for our users to save their payment information so that they could do one click
donations on the Web, and they could also do one click donations in email -
which had never been done before. So, we had to do a lot of funky engineering to
get that to work.


But, you could also SMS donate, which was a first for political campaigns. It was actually a big achievement for us, because the Federal Election Commission said that political campaigns cannot use short codes to fundraise. So, we weren't allowed to work with AT&T and Verizon to send out short codes and ask people to text those. We had to engineer a way around that. When we launched SMS donations it was the first of its kind.


Quick Donate brought in $115 million over its lifespan. It had 1.5 million
users. This was a thrill to work on. But, obviously, this type of program we
optimized. We ran a lot of tests. Those are kind of the things I'm here to share
with you guys.


You might ask how we got here. We ran 500 experiments. We always had a test running. The amount of traffic that we had was really, really intense. We did weeks of user testing. User testing is really simple. It's just putting a user in front of a computer and observing them.


We used a program called Silverback. I don't know if any of you guys are familiar with it. But, it records the iSight camera and the computer screen at the same time. So, you can actually see your user making a donation. We learned a lot from this. We did it on and on and on, to the point where we probably did weeks of it.


Sorry, this thing is pretty sensitive.


We also just did general data gathering which I really like to do. Because if
you're not gathering data then you're kind of flying blind. Just a data point to
show you guys how much data gathering we did, we did over 668 million Google
Analytics custom events. I'll be talking about those in a minute. But, that's a
ton. I don't think that I've ever worked at a place that pushed Google Analytics
to the point that we did on the campaign. It was pretty intense.


You might ask, 'What did that all get us?' It got us a 49% increase in our donation page conversion rate. And, it got us a 161% increase on our email signup page. These are two really high-level conversion goals for us.


The email signup you might not have known about. We didn't really talk about
it. But, I'll let you in on a little secret. Email is responsible for just about
90% of our online fundraising. So, gathering emails on our list was super
important. We spent a lot of time optimizing email acquisitions.


The three things that I want to talk to you guys about today, and this is really what optimization means to me, are experimentation, observation, and data gathering. Experimentation I think we're all mostly familiar with. This is A/B testing, multivariate testing. The second, observation, is what I was talking about when I was talking about user testing. You want to observe your users using your product. Otherwise, you're not going to know how they're using it, because you're not a user. The third is just general data gathering, which is super important.


First up is experimentation. Sorry. This thing's super sensitive. We
identified a process when we were on the campaign. I want to share that with
you. I'm sure everybody has their own processes. But, this is what really worked
for us.


The first step for us in experimentation was to identify our goals. I mean
this from both a micro and a macro level.


On the macro level I just talked about some of our goals which was email
acquisition and donations. You need money to win a campaign. In our instance we
needed emails to get that money.


But, I also encourage you to focus on micro goals. This is like conversion
goals when you're running tests. You should just measure everything. So, micro
goals can be like the error rate on a form, like how many errors do you get when
somebody mistypes their email address. Is the label clear enough there? You just
really want to measure everything.


One thing that really blew me away on the campaign is that we started
measuring the conversion rate on the follow-up page. So, when you made a
donation and it was successful you got taken to a follow up ask that asked you
to save your payment information.


That was Quick Donate. That was the opt in to Quick Donate. That was a very
critical conversion goal for us, because we found out early on that Quick Donate
users were four times more likely to make a donation in the future. That's like
money right there that we needed to focus on.


We measured that goal even though we weren't changing that page at all. We
were changing the donation page. Then, we found out that some of the variations
that we ran actually affected the follow up page. It's really, really important
to measure as many conversion goals as you possibly can when you're doing your
experiments just to get a good sense of what's going on.


The second step that we would do is develop hypotheses. This is really
important. It's just basically like the scientific process that you guys all
learned in grade school. Develop your hypotheses and then test them. This is
really helpful in making sure that you're staying focused.


It's really easy to fall in this trap when you realize how much you can test.
You just start to test everything. You don't want to make any decisions. You
just want to test. It's like, 'Oh, what color should the submit button be?'


'I don't know, test it.'


Don't do that. That's not a good idea.


Create high level hypotheses. One of ours, for example, in the campaign was
that less copy does better than more copy for conversions. So, we tested that on
our splash page. We tested that on our donate page. We tested that on our email
sign up page. We tested it everywhere on the site. We figured out different
experiments to test it.


That's actually number three here: create experiments. Create many experiments to test your hypotheses. You might want to run the same experiment more than once, because you might get different results at different times of the day. There are all kinds of weird things that can happen. Test it multiple times and create several experiments that test your hypothesis.


Oh, wow. The fourth, and I can't stress this enough, is to prioritize with ROI. I touched on this a little bit earlier. But, as you start building out your experiments... I'll illustrate this with an example from the campaign.


We ran an experiment where on our donate page we had a picture of the
President behind a donate form. That was our control. But, then we added an
inspirational quote above the President's head. It said something like 'Stand
with me, work with me, let's finish what we started.'


When we tested that we got something like a 17% increase in conversions.
Because it made the page just a little bit more inspirational and made people
really want to finish and stand with the President. That was awesome.


That was just adding copy. That only took us, like, a couple of minutes to get
onto a page and actually into production when it won. So, ROI on that is really
high.


Our finance team wanted us to implement paying by check, because they had some
data that said a lot of people don't have credit cards. Maybe they have checks
that they can pay with. It sounds like a crazy idea to me, but the data that we
got from them said that we could expect a 3% increase in the conversion rate.


But, on the technical side that was kind of a big lift. That would take days, if not weeks, to implement, and we were only expecting a 3% lift. So, when it comes to figuring out which experiments are going to give you the highest ROI, really dig into the data and make sure that you're focusing on experiments like the inspirational quote and not things like changing your whole donation system for just a 3% increase in donations.
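
To make that concrete, here's a minimal sketch (my own illustration, not campaign code) of scoring experiment ideas by expected dollars per day of engineering effort; the names and numbers are made up:

    // Hypothetical ROI scoring for experiment ideas (TypeScript).
    interface ExperimentIdea {
      name: string;
      expectedLift: number; // e.g. 0.03 for an expected 3% lift
      baseRevenue: number;  // dollars flowing through the page
      effortDays: number;   // engineering time to build and ship it
    }

    // Expected incremental dollars per day of effort.
    function roiScore(idea: ExperimentIdea): number {
      return (idea.expectedLift * idea.baseRevenue) / idea.effortDays;
    }

    const ideas: ExperimentIdea[] = [
      { name: "inspirational quote", expectedLift: 0.17, baseRevenue: 1_000_000, effortDays: 0.1 },
      { name: "pay by check",        expectedLift: 0.03, baseRevenue: 1_000_000, effortDays: 10 },
    ];

    // Highest expected return per engineering day first.
    ideas.sort((a, b) => roiScore(b) - roiScore(a));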


The fifth one is very easy - test your ideas. Then, lastly, you want to record
results. I can't stress this one enough either. Because on the campaign what
happened is we ran so many tests - 500 total - that we couldn't always remember
what the result from one test was.


If we didn't have this awesome Google doc that we built out, which recorded the time, the hypothesis, screenshots of the control and the variation, the result, and a link to the results in Optimizely, we really couldn't have functioned. Because you just can't remember the results of 500 tests.


You can also disseminate that information when you have it in a Google doc.
Just make sure that you're recording your results.
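
If you want to build a results log like that yourself, here's a sketch of the kind of record the talk describes; the field names are my guesses from Kyle's description, not the actual campaign doc:

    // One row of an experiment results log (field names assumed).
    interface TestLogEntry {
      date: string;               // when the test ran
      hypothesis: string;         // what you believed would happen
      controlScreenshot: string;  // URL of the control screenshot
      variationScreenshot: string;
      result: string;             // e.g. "+21% conversion at 95% significance"
      resultsLink: string;        // link to the experiment's results page
    }

    const example: TestLogEntry = {
      date: "2012-05-01",
      hypothesis: "Less copy converts better than more copy",
      controlScreenshot: "https://example.com/control.png",
      variationScreenshot: "https://example.com/variation.png",
      result: "+21% donation conversion",
      resultsLink: "https://example.com/experiments/123",
    };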


Now, I just want to talk about four areas where you can experiment. I've
ordered these by ROI. Copy is, in my experience, by far the highest ROI that you
can experiment with. It's very simple, because you don't have to change any code
or anything. Changing copy only takes a minute or two, and the results that you
can get can be really awesome.


Here is the Quick Donate opt in page that I was talking about before. This is
the page where if you make a successful donation we ask you to save your payment
information for next time.


We did a variation of the header. This one says 'Save your payment information for next time.' Very simple, right? Then, our variation changed the copy to say, 'Now, save your payment information.' It only changed a few words around. It's not a huge change. Obviously, it only took us like a minute to get this test into production.


By making the copy more direct and steering the user toward what we wanted them to do, we got a 21% increase in conversions. Again, this is very little development effort, but a huge result in conversions, or conversion lift I should say. Here you can see, if you missed it before, what the control and the variation were.


After copy, the next highest ROI area of experimentation that I would say is
imagery. Because it's very easy to switch images out, almost the same as copy.
It takes a little bit longer, though.


Here's an example of what we did on the campaign with imagery. This is our
splash page for the 'Dinner with Barack' contest which is a super cool contest.
You could actually win dinner with Barack. They would fly you out to Washington,
DC. You'd sit down with Barack and have dinner. Sometimes Michelle would be
there. Actual people won this contest. After you submit you would get entered
into that.


Here we have a picture of the President. We figured out early on that big
smiling pictures of the President worked because people love him. We had a
hypothesis that people would be more likely to submit this if they could picture
themselves in that scenario. You can't really see the people that he's talking
to. It doesn't really seem like a real contest. It's like, 'Could I really have
dinner with Barack Obama?'


So, we came up with a variation that gave the user a view of a little bit more
of the situation. Those are two actual people on the right that won this
contest. They flew them out, and they had dinner with Barack and Michelle.


Putting a more situational image in there gave us a 19% lift in the conversion rate. Again, this does not take a lot of time to implement. It's just a very easy test, and we got a huge lift on it.


Here are the two different images so that you can see them again.


Another area that I want to talk about is performance. This is going to get a little bit technical. But, you guys are all probably very familiar with how page load affects conversion rate. We were, too. Early on in the campaign we knew that Amazon had published a statistic, and it's a crazy statistic, that even 100 milliseconds of additional latency on page load could drop the conversion rate by one percent. So, that's huge.


We were obsessed with performance. We wanted to make our pages as fast as possible. Here is a look at the architecture diagram for the platform that we started with. It's very simple. It's very basic. It was built by a company called Blue State Digital, which was one of our vendors. I actually came from there before I started at the campaign.


It worked really well for us in the beginning, because it was built out of the
box. As the first engineer there I didn't have time to build a new platform.
This was already out there and working.


The user makes requests to a load balancer, and that splits requests to two
clusters. If you're asking for the page it would send you to the web cluster. If
you actually hit submit on the form it would send you to the payment cluster.


Very simple, but there were a lot of problems with this in terms of performance. We saw, on average, a five-second page load time, which is horrendous when you're processing $690 million worth of donations. You want something more like below two seconds, or how about zero seconds. Can we get the page to just load automatically?


It didn't have a CDN. I don't know how many of you here are familiar with CDNs. That's a content delivery network. If I'm in LA and I request a page, in that architecture diagram the servers were in Boston. So, the data has to go all the way from Boston to LA. If you put it on a CDN... We used Akamai. There's an edge server in LA, so it gets it to you much quicker.


There wasn't any caching in this environment. There were a lot of things that
we needed to change. We basically started from scratch and built a new platform.
We asked Blue State to turn their hosted platform into an API that we could hit
on the client side.


Here's what that looked like. I'm going to run through it really quickly. We put our static assets, which are our JavaScript files, our images, our CSS and such, on an Amazon AWS S3 bucket, which is a super simple data store. It's awesome.


Then, we put the Akamai CDN in front of that. So, we have really fast access
to those. Then, we generated our HTML, the actual pages for these, with a static
site generator called Jekyll which is built in Ruby. It's super simple to work
with. It's great for front end engineers. They don't have to worry about server
side templates and all of that stuff.


Then, we hosted all those HTML files on AWS S3 just like our static assets,
and we put Akamai in front of that. The cool part is the two donation
processors. Like I said before, Blue State built a donation API for us to post
to, and then they had load balancing on their end. They had two nodes behind
their endpoint.


We put ours on EC2, and we put them in two different regions. We put one
payment processor in California, or it may have been Oregon. But, it was on the
west coast. We put another payment processor in Virginia on the east coast.


So, if you had an IP address on the western side of the United States you'd be sent to the west coast payment processor, and the same for the east. If one coast went down for some reason... There was actually a hurricane in Virginia during the campaign that caused EC2 servers to go down, and all that traffic just got sent to the west coast. It was great. It was very redundant.


Once we got this system in place there was never any downtime for accepting donations. We were accepting donations 100% of the time.
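
As a rough illustration of that redundancy pattern, not the campaign's actual code, a client could try the nearest payment endpoint and fall back to the other region on failure. This sketch uses the modern fetch API rather than the XHR of the era, and the endpoint URLs are hypothetical:

    // Try the nearest donation processor first, then fail over.
    const PROCESSORS = [
      "https://donate-west.example.com/pay", // nearest region first
      "https://donate-east.example.com/pay", // cross-country fallback
    ];

    async function submitDonation(payload: object): Promise<Response> {
      let lastError: unknown;
      for (const endpoint of PROCESSORS) {
        try {
          const res = await fetch(endpoint, {
            method: "POST",
            headers: { "Content-Type": "application/json" },
            body: JSON.stringify(payload),
          });
          if (res.ok) return res;          // processed: we're done
          lastError = new Error(`HTTP ${res.status}`);
        } catch (err) {
          lastError = err;                 // network failure: try next region
        }
      }
      throw lastError;                     // both regions unreachable
    }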


The new platform, the biggest metric I think, is that it had an 80% faster time to paint. That means how fast the browser puts something on the screen, not page load. The browser can start rendering the page while the page load metric is still going on, because maybe it's loading some JavaScript or something that's not critical for the initial render. I like to focus on time to paint. We got 80% faster here.
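
If you want to measure time to paint on your own pages today, the Paint Timing API, which didn't exist during the campaign, exposes it directly:

    // Log first paint and first contentful paint, in milliseconds.
    for (const entry of performance.getEntriesByType("paint")) {
      // entry.name is "first-paint" or "first-contentful-paint"
      console.log(`${entry.name}: ${entry.startTime.toFixed(0)} ms`);
    }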


To show you what that looks like, I use WebPagetest, which you guys should all use if you're not using it now. It's super easy to get data like this. The top filmstrip is the fast platform. In one second we have a painted screen. That's a screen where the user can start filling out a donation. That's super fast. The only thing that's not loaded is the graphic assets. Those load by two seconds.


You can see our old platform doesn't even have anything on the screen by four
seconds. That's awful.


We did a lot to increase the performance here. We had a 63% reduction in page
weight. We just threw out all that legacy code and wrote our own. We went from
something like 720 kilobytes to, like, 120 kilobytes. Then we had a 52%
reduction in HTTP requests which is one of the most common things that
contribute to page latency.


What did we get with an 80% faster time to paint? An increase in conversions
by 14%. To measure that, we made a page on the fast platform that was identical
to the slow platform. Then, we A/B tested them with Optimizely. 14% is not as
big as the numbers I was talking about before, but this was in the beginning
when we first launched this platform. This was the A/B test to put it into
production.


When you calculate that 14% against the $250 million this platform brought in over its lifetime, that's $32 million. I'll take that. Money on the campaign was tight. Just by making the platform 80% faster we got $32 million. Obviously, this takes a lot more engineering time and effort, which is why it's lower ROI than the copy and the imagery. But, this is huge. This is $32 million that we got just by making the page faster.
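
A quick back-of-the-envelope check on that figure, treating the $250 million as the post-lift total, lands in the neighborhood of the $32 million Kyle cites:

    const lifetimeRevenue = 250_000_000; // dollars raised on the fast platform
    const lift = 0.14;                   // conversion lift vs. the slow platform

    // Revenue attributable to the lift alone: total minus what the
    // slow platform would have raised on the same traffic.
    const incremental = lifetimeRevenue - lifetimeRevenue / (1 + lift);
    console.log(`$${(incremental / 1e6).toFixed(1)}M`); // ≈ $30.7M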


The fourth area of experimentation that I want to talk about is... Sorry. This is user experience, which also takes a little bit more time.


The screen that you're looking at right now is a donate page that is already
super optimized. This was later on in the campaign. We had run hundreds of tests
on this page, and it was performing brilliantly. We ran a lot more experiments
on it to try and increase the conversion rate, and we kept failing. We couldn't
get the conversion rate up. So, we got really frustrated and we couldn't figure
out what to do.


We decided to try something big. What we did in the variation is chunk the donation experience into four parts. Because if you look at this slide right here, you see all 16 fields. It looks very intimidating to fill out. It looks like it's going to take you forever. But, if you look at this one, all you have to do is select an amount. That's a much lower barrier to entry for engagement. Then, you just go through it and it guides you through very nicely.


We tested this one. I like to call it the gradual incline instead of the steep slope. We got a 5% conversion lift. Obviously, that's not as big as the numbers before. But, like I said, we had already picked all the low-hanging fruit. So, 5% at that point was major, because we had gone a month or two where we couldn't get the conversion rate up at all.


That was a pretty big win for us. Like I said, it was on an already optimized
page. You can see the two forms here. One is obviously much simpler to fill out,
or it looks like it is.
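
Here's a tiny sketch of that "gradual incline" structure; the step names are guesses based on the fields Kyle describes, not the campaign's actual breakdown:

    // The same fields, chunked into sequential steps so the first ask is tiny.
    const steps = ["amount", "personal info", "payment", "employer & occupation"] as const;

    let current = 0;

    // Advance one small chunk at a time instead of showing all 16 fields.
    function nextStep(): string | null {
      current += 1;
      return current < steps.length ? steps[current] : null; // null = ready to submit
    }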


Here are some best practices I want to share with you guys. The first is start
simple. You don't have to make this complicated. My motto in any engineering
scenario at all is start simple and test up. You don't have to make a really
fancy user experience. You don't have to make it all Ajaxy when you launch.


Just get something out there and get it into production, because done is better than perfect. Then, since you're in production so much earlier, you can start experimenting. For each feature that you roll out, you'll know what effect it has on the conversion rate, because you can test it.


The second is always have a test running. If you have traffic coming to your
site, which you probably do right now, and you're not running a test that's just
wasted potential right there. Because you're not learning from the people that
are going to your site. Always have a test running.


The third is don't be afraid to fail. I can't stress this one enough. I can't
actually remember the numbers, but I want to say something like only 20% of our
experiments on the campaign actually raised the conversion rate. A lot of them
were a statistical tie where it resulted in nothing. Some of them even decreased
the conversion rate. Those are pretty damaging psychologically, but you can't
let that get you down.
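
For context, a "statistical tie" means the gap between control and variation is too small, for the traffic you have, to call a winner. One standard way to check, sketched below, is a two-proportion z-test; the talk doesn't say what math the tooling actually used:

    // Is the variation really different from the control, or a tie?
    function zScore(convControl: number, nControl: number,
                    convVariation: number, nVariation: number): number {
      const pC = convControl / nControl;
      const pV = convVariation / nVariation;
      const pPooled = (convControl + convVariation) / (nControl + nVariation);
      const se = Math.sqrt(pPooled * (1 - pPooled) * (1 / nControl + 1 / nVariation));
      return (pV - pC) / se;
    }

    // |z| >= 1.96 is roughly 95% confidence; anything less is a tie.
    const z = zScore(480, 10_000, 510, 10_000);
    console.log(Math.abs(z) >= 1.96 ? "significant" : "statistical tie");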


I want to show you an example of this. Ignore the amount buttons. This is a bad screen shot. I don't know how that came about. But, everything was the same except that on the variation there's a little check box down there that says 'Save my payment information'.


Somebody had the idea that instead of asking on the follow-up screen to save your payment information, we would put it on the donate page, because they thought maybe that would increase the conversion rate on saving people's payment information. Well, this slide is a little out of order.


That actually reduced our conversion rate by 44%. Right when we saw that we stopped the experiment immediately and just moved on. That's the whole thing about testing. It's not permanent. You can just move on. You might not even have thought that it would turn out that way. I'll go back to this slide. If you aren't failing then you aren't testing enough, because you're not going to have 100% success in your tests. It's just not possible.


The second area of optimization I want to talk about quickly is data gathering. You really cannot gather enough data. That's really my motto.


We on the campaign just gathered any kind of data that we could think of: error rates on forms, when people focused on the form fields, and how long it took people to submit the form. And how long it took, after the user hit submit, for our Ajax request to get a response from the server, so that we could tell the back-end engineers how long it was taking. Because we wanted it to be faster, obviously. Anything we could think of, we measured.
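
As an illustration of that Ajax timing measurement, here's my own sketch, using today's fetch and performance.now rather than whatever the campaign used, recording the round-trip as a classic ga.js custom event:

    declare const _gaq: unknown[][]; // the old async GA queue

    async function timedSubmit(endpoint: string, body: string): Promise<void> {
      const start = performance.now();
      await fetch(endpoint, { method: "POST", body });
      const elapsedMs = Math.round(performance.now() - start);
      // _trackEvent(category, action, label, value) — value holds the ms
      _gaq.push(["_trackEvent", "donate form", "ajax response", endpoint, elapsedMs]);
    }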


Again, here's this number. We did over 668 million Google Analytics custom events. Here's an example of one. This is an interactive infographic that we put out to showcase our 1 million donors. It was pretty early on in the campaign. It has a lot of little pieces of interactive content where you can scroll to see names, the most popular names people donated under, and where people are from.


One part of that is this little piece right here, which you can just scroll through and see the most popular names. We put Google Analytics custom events on the left arrow and the right arrow, and we found that 82% of the clicks were on the right arrow. So, that left arrow was unnecessary. It's just cluttering the UI and giving the user more options, when you obviously want to be guiding the user through what you're presenting to them.


We used that learning to optimize our UIs further down the road, and we just didn't put left arrows on anything, because it doesn't really make sense. This is the Google Analytics custom event we used to track that data. It's super simple, and the values are arbitrary. The category is 'one million infographic'. The label is 'name slides'. Super simple.
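
Reconstructed from that description, the classic ga.js call would look something like this; the category and label come straight from the talk, while the "click" action and the selector are my assumptions:

    declare const _gaq: unknown[][]; // the old async GA queue

    // Fire a custom event each time the right arrow is clicked.
    document.querySelector(".right-arrow")?.addEventListener("click", () => {
      _gaq.push(["_trackEvent", "one million infographic", "click", "name slides"]);
    });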


The last area that I want to talk about is user testing. This is actually a really cool example, because it solved a problem that I don't think we were going to be able to solve without user testing.


This is the last step in the donation process. This is where we're asking for
your employer and occupation. This is required of us by the Federal Election
Commission. So, there's no choice. We had to gather this information.


Well, when we put error tracking on our donate form, we found out that the two most common errors, behind people mis-entering their credit card information, were employer and occupation. We were like, 'Wow, that's really weird. How can that be such a hard question?'


We went through and looked at the data people were submitting. It was like, 'None of your business', 'F you'. People just aren't comfortable, right? So, that was that. There's nothing we can do to make people more comfortable, really.


So, we just left it at that until we started doing user testing. We took a lot of the volunteers that came into headquarters. There were a ton of them. There were students, there were retired people, all kinds of age ranges.


We sat them down at the computer with Silverback, and we asked them to make a donation. Sorry, I'm cheating a little bit. We found out that the students and the retired people did not know what to put in there, because they're not employed.


Again, this is us thinking of ourselves as the users. We work for the campaign. 'I know where I work. I work for Obama for America.' That's a very simple question for me.


But, to a retired person it's like, 'What do I put in there?' So, they don't
put anything, and then they hit submit and that triggers the error. That's why
the error rate was going up so high on these forms.


Once we got that feedback from user testing and observing our users use our product, we put in a little tiny - and I don't know if you guys can see it, but it's just a little tiny line that says 'If you are retired please enter "retired" in both fields'. A little tiny bit of copy. It did not take us a long time to put that in there.


Adding that field hint in reduced the error rate by 63%. That's just crazy.
Like I said, we would not have known to test that beforehand if we weren't doing
user testing and watching our users.


I blog about all of this stuff a lot on my personal website. It's kylerush.net. I go into a lot more depth on the technical side and a lot more experiments, if you want to check that out.


That's all I have for you guys. Thank you.


Cyrus Shepard (emcee): Let's step over here under the light...


Kyle: ...You want this?


Cyrus: Awesome work, man.


Kyle: Thank you.


Cyrus: I assume you're using the enterprise version of Google Analytics.


Kyle: Is there an enterprise version?


Cyrus: Yeah, yeah.


Kyle: I know that we had a direct line over there where we were like, 'Hey, our stuff's not loading, can you please do something?' They were, like, 'Refresh it, because there was too much data...'


Cyrus: Yes, yes...


Kyle: ...It was a lot going on.


Cyrus: One question I did want to ask. For your testing platform, did you
build that yourself, or did you use an off the shelf version?


Kyle: No, we used Optimizely.


Cyrus: You used Optimizely.


Kyle: Yeah, which is awesome...


Cyrus: ...And, you'd recommend it?


Kyle: If you guys aren't using that, use Optimizely. It's amazing.


Cyrus: Yes, question?


Amanda: Is this on? There we go. Hi, my name's Amanda Stevens. I'm from a marketing agency in Winnipeg, Canada. Fantastic presentation. My question for you is, you talked a little bit about the design elements and the UX changes you made to the website to add that lift. I'm just wondering if you can expand on some other design elements that you incorporated to increase conversions.


Kyle: Yeah, sure. I don't want to be too harsh on design, but in my experience
what we tested on design, embellishments and stuff, is just kind of a waste of
time. It's fine if the designers want to put that in there. That's great.


But, like I said, when you're testing, like, button colors, and rounded corners versus square corners, do not waste your time with that. That's not going to do anything. It's just going to be a time sink.


Really, when it comes to design, our brand was all about imagery and photos. That's where we got the really big increases from design changes: imagery. Other than that, I wouldn't say that we found anything as far as design goes that had a real impact on the conversion rate.


Amanda: Cool. Thank you.


Kyle: Yeah.


Cyrus: Yes.


Alan: Hi, I'm Alan. I'm with Three Ventures Technology and Agency. I actually watched Dan speak at an analytics conference in San Francisco. One of the things that I would like to ask you about is why Optimizely and not Google Analytics content experiments with the multi-armed bandit approach, basically minimizing the time it takes to call a winner on a conversion rate at 95% probability. I mean the amount of time it would take for an A/B test to finish at those rates.


Kyle: Yeah, sure. I can talk about this forever, but I'm going to make it
really brief. If you're an engineer there's really no other option for you.
Because Optimizely makes your life so, so easy.


All it is is running JavaScript on top of your page. When you can do that you
just add CSS classes to the page and it changes the design. It's so easy.
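
That pattern, sketched below with made-up class names, is just a variation script flipping a class and letting CSS do the visual work:

    // Variation code the testing tool runs on top of the page.
    document.documentElement.classList.add("variation-b");

    // CSS shipped with the variation then restyles the page, e.g.:
    //   .variation-b .donate-header { display: none; }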


We actually were tasked with finding other A/B testing platforms that were
either cheaper or I don't know what the situation was. We evaluated a lot. I
don't want to dump on other platforms, because every one has its use. But, for
us on the campaign Optimizely was by far the best.


One of the problems with Google Analytics is the data's not live. Optimizely gives you live reporting on the results. So, you can see right away if your experiment is dragging your conversion rate through the dirt, and you can stop the experiment.


It also gives you a lot of customization. You can do really advanced targeting. You can target people based on a cookie. You can target people based on their region. It even has, like, a JavaScript expression option.


There's nothing that we couldn't do in Optimizely. Any idea that we came up
with we could do in Optimizely. We tried it in other platforms. There were a lot
of limitations. From an engineering perspective that's why Optimizely is great.
That's mainly why we chose to go with it.


Alan: Cool, awesome. Thank you.


Kyle: Yeah.


Cyrus: And, I think we have time for one more. We'll go over here.


Q: Okay, so I work in fundraising. Most of the time the relationships that we're dealing with, in terms of how long a person is going to donate, last five or ten years, longer if we're talking about direct mail. So, it seems like a lot of what you were looking at is immediate return. I don't know if you had an LTV measure where you were saying we got a 60% increase in conversions, but it affected the LTV, or even just the length of the relationship, by X. Did you look at things like that?


Kyle: Yeah, we did. I would say it's very difficult to measure something like that, because it's not exact, like the user's on the page clicking something. But, if you think about it, we've been raising money, not me personally but the campaign, since 2007. So, there is a long-term donation cycle there.


The campaign is actually still raising money now. They have an organization
called Organizing for Action that exists to support the President's legislative
agenda. They're still raising money.


I would say that in a political campaign where it's so crazy and there's a
deadline that is election day, which usually people do not have to deal with,
it's more about the short term. But, they are still doing long term stuff. We
just didn't have to worry about that as much because it was November 7, that's
the day.


Q: Okay, thank you.


Cyrus: Kyle, thank you so much for coming to Seattle.


Want more? Kyle's coming back for MozCon 2014, and you can buy your MozCon
2014 ticket today and save $400.


Can't wait? Get a front-row experience for all 37 sessions, plus their slide decks, with the 2013 MozCon Video Bundle. Moz Analytics subscribers, you get a $100 discount: $399 regular price - $100 subscriber discount = $299 for the entire video bundle!





