Wednesday, October 17, 2007

Email Diva? hmm, stick with email

I was just handed an article written by the Email Diva (it was hard copy, otherwise I would have the link). To summarize, the author stated that since there is no standardization in email metrics (citing an EEC whitepaper), one should seek out benchmarks from one's Email Service Provider, or from Marketing Sherpa, which is close to an apples-to-apples comparison. However, because of non-standardization and other issues, “Comparing your results to industry standards will never tell you whether the effort is worthwhile for your company….The only standard is: did I make money/was I able to acquire new customers at an acceptable cost?”

Well, in regard to the standardization issue, I wholeheartedly agree that there is currently a problem. Going to your provider is a great option. Indices such as The Bulldog Index ensure everything is calculated and treated the same way. However, I do not agree that the Marketing Sherpa Guide is close to apples to apples. I think it is a good and important guide, but by its very nature it is a survey, and therefore fraught with non-standardization. Again, another reason for indices like the Bulldog Index!

The main concern I have is with the statement that there is no reason to use industry standards (even when they are standardized). The only standard is money? What IS an acceptable cost? An industry benchmark helps you decide what your standard SHOULD be and lets you compare yourself to competitors. If your CPL is $35.00 one month and $33.00 the next, great, you improved; but if your industry average is $25.00, you have a lot of work to do, and your standard needs to be raised, otherwise you are losing out to your competitors. The landscape changes dramatically if the industry CPL standard is $45.00. Then you can decide between continuing to improve what you are currently doing, or taking those resources and going after something else.
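As a quick sketch of that decision logic in Python (using only the illustrative numbers above, not real benchmarks), it looks something like this:

```python
# A minimal sketch of the CPL comparison described above. The figures are the
# made-up examples from the post, not actual industry averages.

def cpl_verdict(current_cpl, previous_cpl, industry_cpl):
    """Compare a cost-per-lead trend against an industry benchmark."""
    trend = "improved" if current_cpl < previous_cpl else "got worse"
    if current_cpl > industry_cpl:
        gap = current_cpl - industry_cpl
        return f"You {trend}, but you are ${gap:.2f} above the industry average: keep working."
    return f"You {trend}, and you are at or below the industry average: consider shifting resources."

print(cpl_verdict(33.00, 35.00, 25.00))  # improved, but still $8.00 above the benchmark
print(cpl_verdict(33.00, 35.00, 45.00))  # improved, and well below the benchmark
```

Same internal trend in both cases; only the benchmark tells you which conclusion to draw.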

Friday, October 12, 2007

Right is Right

An individual once came into my office after a particularly spirited argument about statistics and wrote on my board “Right is Right.” I kept that on my board until I moved out of the office. For a while I thought, yes, I must be that theorist: I am right, I know theory, and it is my job to remain strong in theory. I am not so sure about that anymore.

Basically, sometimes you will have a person on one side arguing for the conservative, statistical route, and on the other side a person explaining that you are thinking too analytically and need to focus on the overall goal. In the end, both are right and both are needed. You need a “theory guy” to ensure that what is being done follows the correct models and assumptions. However, a lot of the time that theorist can be too involved in theory and not involved enough in delivery. That’s where the other person comes into play: the “strategy guy.” While it is the theorist’s job to bring the right assumptions to the table, it is the strategist’s job to bring the theorist more into the real world. If this can be done well, it can be a great synergy.

Look, it is about being right, statistically, because if it is not, then no model will work. But it is also about delivery and getting things done. Sometimes you don’t have the correct data to build the perfect model, and if you wait too long, you and your company lose out. It is the organization with a good synergy between the two that will be the most successful!

Wednesday, October 3, 2007

Data from Multiple Sources

I was just working on a presentation for Innotech next week, and I got to thinking about something that has always concerned me. Technology has afforded us the ability to capture droves of data, and has also given us much more user-friendly software with which to analyze that data. Some of this is really good for us, and some of it is bad. In the wrong hands, bad data and assumptions can bring a company to its knees pretty quickly. In order to choose the correct model, one must know what the model assumptions are. I have seen many analyses completed on data that do not satisfy the required assumptions. Unfortunately, the software now available compounds this. In the good old days, one really had to at least understand the make-up of the data in order to run an analysis. It wouldn't stop anyone from doing the wrong thing, but it was a decent barrier. Now, one can just push and pull data through systems without knowing much about whether what they are doing is really right or not. Some software systems have developed barriers, but this still does not stop some weird things (I was once asked by an analyst why a software system would not allow him to run a multiple regression with over 500 variables!). Am I saying everyone needs to be a statistician? Well, no.... But what I am saying is: if you don't know some of the basics, beware of your results.
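As a quick illustration of why that 500-variable regression is a bad idea, here is a toy sketch in Python (purely made-up random data, not anything from a real analysis): when you have far more variables than observations, the software will happily hand you a "perfect" fit even though the predictors are pure noise.

```python
# Hedged toy example: regression with many more variables than observations.
# Both X and y are random noise, yet the fit looks perfect.
import numpy as np

rng = np.random.default_rng(0)
n_obs, n_vars = 100, 500                 # far more variables than observations
X = rng.normal(size=(n_obs, n_vars))     # predictors: pure random noise
y = rng.normal(size=n_obs)               # response: also pure random noise

# Least squares "solves" this anyway (minimum-norm solution)...
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
residuals = y - X @ beta
r_squared = 1 - residuals.var() / y.var()

print(f"R^2 = {r_squared:.3f}")          # ~1.0: a perfect fit that predicts nothing
```

The software runs without complaint; only knowing the assumptions tells you the result is meaningless.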

Tuesday, October 2, 2007

Probability of Eye Injuries

Wow,

OK, sorry everyone. I am back. I had a little mishap: I was out running and I actually got stung in the eye by a wasp. Now, I was thinking, what is the probability of THAT happening? So I tried to do a little research on wasp stings and the likelihood of getting stung in the eye. Unfortunately, there is little usable information out there. I did find that there are about 9K fireworks-related mishaps a year, with 30% of these affecting the eye. Hmm, I'm a little curious to find out where in the country these happen the most! I also found out that there were about 42,286 work-related injuries to the face in 2002, and 70% of those involved the eye. Dang! Unless I was working with beekeepers, that won't help me.
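Just for kicks, the back-of-the-envelope arithmetic on those figures (the same rough numbers cited above, nothing more):

```python
# Rough arithmetic on the published figures mentioned in the post.
fireworks_mishaps = 9_000            # per year, ~30% affecting the eye
fireworks_eye = fireworks_mishaps * 0.30

work_face_injuries_2002 = 42_286     # ~70% involving the eye
work_eye_2002 = work_face_injuries_2002 * 0.70

print(f"Fireworks-related eye injuries: ~{fireworks_eye:,.0f} per year")
print(f"Work-related eye injuries (2002): ~{work_eye_2002:,.0f}")
```

Lots of eye-injury numbers, and not one of them gets me any closer to the odds of a wasp sting on a morning run.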

Do you ever feel like this when trying to calculate what seems to be a simple problem? You can't find the correct data, and you end up chasing the wrong information. Sometimes, as in the case of the wasp sting, you may just have to cut your losses and try another day. Otherwise you can reach too hard for honey that turns out to be just jam.