First of all, thank you to everyone who’s been completing the monthly surveys. They’re incredibly valuable and give a much clearer picture of how members feel about the state of the course. If the only voices we hear are of those who have something to complain about – however legitimate – it’s all too easy to change something that the great majority rather liked. The survey tries to ensure that all voices are heard equally.

For those interested, the first six questions – on greens, fairways, bunkers, rough, overall course and ‘pride’ in Kilspindie – are all Net Promoter Score (NPS) questions, or as near as we can get to them. You might not have heard of NPS, but if you’ve ever visited a supermarket or bought something online, you’ll have seen its classic question: “How likely are you to recommend XXXXXX to a friend or relative?”

What you can’t do with the answers to such questions is average them. When respondents are generally pleased, most are generous with their praise, giving 9s and 10s; if they think things are ‘fine’ they’ll head for around 7; but anything of 5 or less means ‘dissatisfied’.

However, that’s about as accurate as we can get. At Kilspindie, the Board chose to identify 8-10 as ‘Good’, 6-7 as ‘Fine’ and 1-5 as ‘Unsatisfactory’. But some people never give a 10 as a matter of principle, whereas others want to praise if at all possible, so in one sense an 8 from one respondent can be worth more than a 9 from another. Likewise, at the bottom end, if two members want to complain about bumpy greens, Mr Grumpy might give a 1 (especially if he or she has just had seven three-putts) while Mr Nice might give a 5. So you can see why averaging wouldn’t work: Mr Grumpy’s score skews everything.

NPS deals with that by treating all the responses in the same category as identical. We take the percentage of Good responses (8s, 9s and 10s), ignore the Fine ones (6s and 7s) and subtract the percentage of Unsatisfactory ones (1s to 5s). That produces a score which is either positive or negative. The great thing is that we can then track how attitudes change month by month, which aspects of the course satisfy most, and which ones the members would like the greens staff to look at. When we have enough data to show you, we’ll let you see a wee graph.
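For anyone curious how the arithmetic works in practice, here is a small sketch of the calculation using the Kilspindie bands described above (8-10 Good, 6-7 Fine, 1-5 Unsatisfactory). The function name and the sample responses are purely illustrative, not real survey data:

```python
def nps(scores):
    """Net Promoter-style score from 1-10 survey responses.

    Uses the Kilspindie bands: 8-10 'Good', 6-7 'Fine',
    1-5 'Unsatisfactory'. Returns the percentage of Good
    responses minus the percentage of Unsatisfactory ones;
    'Fine' responses count towards the total but are
    otherwise ignored.
    """
    if not scores:
        raise ValueError("no responses")
    good = sum(1 for s in scores if s >= 8)
    bad = sum(1 for s in scores if s <= 5)
    return 100.0 * (good - bad) / len(scores)

# Hypothetical example: ten responses to the 'greens' question.
# Six are Good (8-10), two are Fine (6-7), two are Unsatisfactory
# (1-5), so the score is (6 - 2) / 10 = +40.
responses = [9, 8, 10, 7, 6, 8, 5, 9, 3, 8]
print(nps(responses))  # 40.0
```

Note that the ‘official’ NPS formula uses a 0-10 scale with slightly different bands; the version above simply follows the bands the Board chose for our surveys.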

Unsurprisingly, members are least satisfied with the rough and bunkers, although it’s worth adding that very few golfers complain about bunkers and rough if they’re never in them. Surprisingly, the biggest single factor in ‘customer satisfaction’ is the weather. If there’s a long sunny dry spell, members are delighted. But when the rain falls, the greens slow up, the ball doesn’t run as far on the fairway, the rough thickens and even the sand gets pounded down so members complain that there’s no sand in the bunkers! Quite a bit is therefore out of the hands of the greenkeeping staff, although clearly they try to limit the damage.

Finally, one respondent (remember, they’re all anonymous) has asked about question 6, on “pride in Kilspindie”. A question such as ‘how likely are you to recommend Kilspindie to your friends or relatives’ doesn’t work, partly because it makes little sense anyway and partly because some respondents might think we were looking for more visitors. But question 6 asks the question in a similar way and acts as a ‘control’, allowing us to compare and check the validity of every other answer. It’s actually a fairly standard form of question in surveys and a vital component of our Kilspindie ones.