New round of beta testing

12 09 2008

Since starting the sales initiative I posted about earlier this week, I’ve been thinking about ways to expand the reach of ClimbPoint and increase the size of my market.

I’ve known for a while that ClimbPoint would probably work really well in community recreation centers, but I haven’t yet tested it in any of those facilities. So, starting this week, I’ve begun contacting community rec centers with climbing walls about participating in a no-obligation beta test of ClimbPoint.

The general terms of the beta test are below. My primary concern is that I’m being too generous and should be charging testers some kind of fee up front. My rationale for keeping everything no-cost and low-commitment is that I need users who can attest that the software is in fact awesome and super easy to use. I also need feedback on what works well and what doesn’t.

The terms

  • I’ll send out a full version of the latest release of ClimbPoint. Beta testers agree to install the program and use it to keep track of climbers at their facility for three months.
  • Testers agree to record their thoughts and suggestions for the software, and every 2-3 weeks we’ll have a brief phone conversation so I can understand how well ClimbPoint fits into the way they work there. As a result of our conversations, I may send out updated versions of the software to install and use at the wall.
  • At the end of the three-month testing period, testers are free to continue using the version of ClimbPoint that is currently installed, with no requirement to purchase a full license. They’ll have the option of purchasing a full license at a 20% discount; a license entitles them to software updates (which add new features) and email support.

Well, am I being too generous?  Should I charge a small fee up front for participation?  Will anyone respond?  Thoughts and suggestions are welcome.





Feedback from my beta testers

11 03 2008

A couple of weeks ago I spoke with the five climbing wall managers who have been beta testing ClimbPoint over the last few months, and I am stoked about the response. The feedback was largely positive, it suggests I’ve succeeded in creating fans of my climbing wall software, and I came away with some great ideas for new features. Below are the aspects of the software that my testers were most enthusiastic about:

Easy to use
Installation at all sites was a snap, and “super easy” in the words of Robert Taylor from Clemson. It was great to hear Robert describe how, once he finally got a card reader from the IT department, he was able to plug it in and it “just worked”. Here are a few other quotes:

“One of the most intuitive and user friendly programs on the market” – Mark Lattin, University of Kentucky
“The main thing I like is its simplicity” – Mike Maxam, Miami University
“It’s really easy to use” – Lynette Bowsher, Indiana Wesleyan University
“I like it…it’s fairly streamlined” – Jerome Gabriel, Bowling Green State University

Idiot proof
I’ve mentioned this aspect before as a selling point, and there are a couple priceless quotes on this topic:

“[It’s] pretty streamlined…there’s very little margin for error” – Mike
“It has done wonders in terms of eliminating our administrative errors…You can’t get rid of all the dumb mistakes people make, but this does a pretty good job” – Mark

A complete solution
Because I have so many great ideas for additional features of the software, it’s hard for me to describe ClimbPoint as a total package at this point. My testers, however, found that it met their needs for daily operations:

“It has streamlined everything we do at the wall” – Lynette
“All of the functions you need to run a wall are there” – Jerome

Awesome all around
My terrific testers also mentioned that they loved how their staff could now all be on the same page, and how the reporting features save them from running daily numbers on wall attendance. But one of the best quotes came from Lynette, when she said “I can’t imagine not having it”.

Want to see what all the fuss is about? Check out the website or go download the free trial.





Moving toward an open beta

19 02 2008

Thus far in my development of ClimbPoint I have basically hand-picked my beta testers. Early on this served a few purposes. First, I wanted to give each tester some personal attention and work closely with them to gather their feedback. Second, I knew the software wasn’t quite polished yet, so I wanted to generate a little buzz…but not hype. Finally, I knew my initial market would be university climbing walls, so I wanted to be strategic in who I sent the software to.

Now that my first beta period is coming to a close, I’ve begun to open it up to a few more schools — most recently Clemson University — and I’m considering opening it up to commercial climbing gyms. This is something of a departure from my initial plan, as I hadn’t anticipated that the initial version of the software would actually be useful to a commercial gym.

However, after a few conversations with the owner of RedPoint Indoor Climbing, I think that ClimbPoint might be useful for climbing wall management provided that it’s used in conjunction with something like QuickBooks POS.

To prepare for this expansion of the beta program, I whipped up a 60-day trial of ClimbPoint, which I’m making available on a limited basis. I don’t plan to post the program for download until May, but in the meantime any interested parties can request a trial copy by sending me a friendly email.





ClimbPoint 0.6 released

29 01 2008

Yesterday I sent out the latest release of ClimbPoint to my beta testers, complete with a few brand-new reporting features. The new version lets climbing wall managers view and print reports that (1) detail the most active climbers and (2) summarize monthly wall activity, including the number of total visits, unique climbers, and registered users. While this latest release only shows monthly statistics, future releases will be able to compile hourly, daily, or weekly statistics in addition to the monthly summaries (which my friends tell me will be a huge help in managing their staff and hours of operation).
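For the curious, the monthly summary boils down to a simple aggregation over check-in records. Here’s a minimal sketch in Python of the idea (the record layout and sample data are invented for illustration — the actual ClimbPoint internals look different):

```python
from collections import defaultdict
from datetime import datetime

# Each check-in is (climber_id, timestamp); layout invented for illustration.
checkins = [
    ("alice", datetime(2008, 1, 5, 16, 30)),
    ("bob",   datetime(2008, 1, 5, 17, 0)),
    ("alice", datetime(2008, 1, 12, 16, 45)),
    ("carol", datetime(2008, 2, 2, 18, 15)),
]

def monthly_summary(checkins):
    """Group check-ins by (year, month) and tally visits and unique climbers."""
    by_month = defaultdict(list)
    for climber, ts in checkins:
        by_month[(ts.year, ts.month)].append(climber)
    return {
        month: {"visits": len(names), "unique_climbers": len(set(names))}
        for month, names in by_month.items()
    }

summary = monthly_summary(checkins)
# January 2008 above has 3 visits from 2 unique climbers
print(summary[(2008, 1)])
```

Compiling hourly, daily, or weekly statistics would just mean swapping in a finer-grained grouping key than (year, month).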

I also squashed a few bugs with this release, which are detailed along with the new features in the changelog (courtesy of FogBugz). Let me also take a minute to say thanks to the folks at FogBugz for creating an awesome tool for software project management and making it free to one- and two-person startup shops. This is the tool that has enabled me to keep track of all the great feedback that I’ve gotten from my beta testers.

Speaking of which, there’s a whole boatload of great features that I can’t wait to get started on. In the meantime, though, if you’d like more info on the software or want to get in on the next round of beta testing, you can contact me.





Visit to Indiana Wesleyan

10 01 2008

On Monday of this week I took a trip to Marion, Indiana to visit the climbing gym at Indiana Wesleyan University. IWU was one of the first sites that agreed to test the rock climbing software, and I picked them because they were one of a few sites within driving distance of Purdue.

It worked out for me to sit in on a climbing wall staff meeting while I was there, which made the experience that much more productive. Prior to my visit I had received most of the product feedback via email, and while there had been some good suggestions, I was itching to talk to a few users face to face.

The meeting turned out to be more productive than I could have imagined, and for me it was a huge confidence builder. The employees were stoked about the software (well, as stoked as anyone could be about software), and after I got the ball rolling with a few questions, the suggestions started pouring in.

Some of the cooler ideas that we kicked around were integration with Facebook, avatars or pictures for each climber, and a climber rating system based on experience. We also talked about a feature to organize and score competitions, and hashed out the wall activity reports that I mentioned earlier.

One of the many priceless quotes from the meeting came after most of the above suggestions had been given.

Me: Is there anything else you wish the software could do?
…silence…
Employee: Can it belay someone?

Truthfully, they were happy with the software…which made me happy. I’m looking forward to releasing the next version in a couple weeks, and am planning to stay in touch with the staff there. Marion is a relatively short drive from Purdue, so I’ll likely drop in again sometime in the next few months.





A real beta test

20 12 2007

As a geek I’m always on the lookout for little tools and gadgets that can make my life easier. It was in that spirit that I signed up for the Twine beta a couple weeks ago. Now, I still can’t tell you what exactly Twine does, but I was pleased to get this response a few days ago:

“We are doing a real beta test – not just slapping the word ‘beta’ on a released product. The true purpose of a beta test is to work closely with beta testers to gather feedback and improve a product before general release. Our plan is to do several waves of small tests, to gather first-round feedback and improve our service, before we begin to roll out to a wider audience.”

I think they’ve got the purpose of a beta test pretty well nailed, and I’ll be interested to see how closely they do work with their beta testers to gather feedback (assuming that I’m in on the beta). From my experience, working with beta testers can be beneficial, but it requires some work to let people know that you really do want to hear from them.

I’d like to be better about letting my beta testers know that I care about them, and I’m open to ideas on how to do that. Up to this point I’ve contacted them via phone and email on a semi-regular basis (every couple weeks), but haven’t created a website, newsletter, or anything else more formal to keep them up to date on how the product is evolving.





Are all beta testers early adopters?

1 12 2007

Last month, when I settled on five beta testers, I was unsure what to expect in terms of participation. After all, four of the first five sites that I contacted were interested in trying out ClimbPoint. Were they just being nice, or were they genuinely interested? I figured that out of five testing sites I might get two that would provide helpful, consistent feedback. Truth be told, I was a little uneasy that if all of them were totally on board I might be in over my head.

Well, so far it’s turned out that my intuition was mostly right. Two sites have installed ClimbPoint and are using it for day-to-day operations. A third is planning to switch over after the Fall semester ends, and the other two… well, who knows. I’m optimistic that all of the sites will eventually begin using the software consistently, but the relative success of the beta program so far (especially given the number of sites I initially contacted) has me thinking about the broader market.

Will everyone be this receptive?

I’ve gone back and forth on this question many times, and could argue either way. There are reasons that those who participate in beta tests wouldn’t actually purchase the software — namely because beta testing is free, low-risk, and low-commitment, but also because some people just like to wait to adopt new technology. The technology adoption lifecycle provides a good illustration of this point: there are early adopters, the early and late majority, and laggards in adopting any idea. Some people just need to be convinced, and some just need to see others using a product before they try it out.

Technology adoption curve

Beta tester != Early adopter

I’ll pause here to mention that I consider beta testers and early adopters to be separate groups, because early adopters actually pay money for a product while beta testers get it for free… hence my question in the title. I do think, though, that only the first three adoption groups are likely to become beta testers — agreeing to beta test does indicate that there is more than a slight chance that a purchase will be made.

Anyway, given the adoption curve I can see why I might have some trouble selling this product to everyone right away. It takes time to convince the early majority to bite, and the late majority won’t buy until they know quite a few other people who are already using the product.

The bottom line

I keep coming back to the fact that four of five sites were interested, and that half of those have jumped in and given really positive reviews so far. Could it be that the type of people who manage climbing walls are also the type of people who tend to be early adopters? For now I’m trying to temper my optimism, if only for a month or two. In a couple months I hope to have version 1.0 finished, and then we’ll see how many sites put their money where their mouth is ;).

I’ll be interested to find out if the adoption curve really holds up and I get about a 10% response rate. I sort of have a feeling it will…