
2018 snapshot

2018 Sunday snapshot - how are we doing?


Ringers at 440 towers shared details of how their Sunday service ringing went on 7 January 2018. And while this is no more than a snapshot (see below), the results suggest that it’s not all doom and gloom - but we could be doing better.


  • Almost half of responding towers (49.77%) rang all their bells

  • More than a third (36.36%) rang some of their bells

  • Less than a tenth (9.77%) of responding towers didn’t ring at all

  • 18 towers (the balance) gave answers that don’t fit this analysis


  • Compared to last year, things are staying more or less constant. More than three fifths of towers (64.05%) rang the same number of bells as on the same Sunday in 2017; 50 towers rang more bells than last year and 54 rang fewer.


  • One tower with three or fewer bells responded; 52 towers with 12 or more bells responded; and the largest number of responses (204, 46.58%) came from eight-bell towers
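
For anyone who wants to check the arithmetic, the percentages in the first three bullets above appear to be taken over all 440 responses (that is my assumption - the base isn’t stated explicitly), and on that basis they round back to whole numbers of towers that sum to 440. A minimal Python sketch of the check:

# Minimal sanity check of the 2018 figures, assuming (my assumption)
# that the percentages are calculated over all 440 responses.
responses = 440

percentages = {
    "rang all their bells": 49.77,
    "rang some of their bells": 36.36,
    "didn't ring at all": 9.77,
}

# Convert each percentage back to a whole number of towers.
counts = {answer: round(responses * pct / 100) for answer, pct in percentages.items()}
counts["other answers"] = 18  # the stated balance

for answer, towers in counts.items():
    print(f"{answer}: {towers} towers")

print("total:", sum(counts.values()))  # 219 + 160 + 43 + 18 = 440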


How does this compare to the 2017 snapshot?


In January 2017, I asked some similar questions about Sunday 8 January 2017. But I only asked ringers on Facebook, and used rather different questions; 186 towers responded. So it’s hard to make direct comparisons - and (see below for why this is no more than a snapshot) I may not even have asked about comparable days.


Given these caveats, I can only say tentatively that things are perhaps a little worse this year:

  • In January 2017, 59.79% of towers had enough ringers - or more than enough ringers - to ring all their bells. In 2018, 49.77% of towers rang all their bells

  • In January 2017, 40.21% of towers didn’t have enough ringers for all their bells. In 2018, 47.13% rang some or none of their bells.



Just a snapshot - why nothing here should be taken too seriously


With 440 results (something over 6% of the total number of towers listed in Dove), it might be tempting to say that there are enough answers to start drawing some statistically significant conclusions. But I would say that we probably can’t do this.


There are a number of reasons for this - not the least of which is that I’m no statistician and have no real idea of how to design a survey. But other reasons for not wanting to rely too much on what we have here include:

  • The respondents are self-selected. This is a problem in two ways: if no one in a tower uses the Facebook groups and email lists that I used to publicise the survey, then no one there will respond; and, even among those who did hear about the survey, if ringing didn’t go too well in the survey week it’s tempting not to report it - whereas if the tower was bursting at the seams with ringers, all ringing fantastic things, you’ll (rightly) want to shout about it.

  • By definition, if no one is ringing at a tower, no one is there to report the ringing that didn’t happen. So positive responses are over-reported.

  • I haven’t been quite as clever in finding a ‘memorable’ week as I thought: last year, the first Sunday after the New Year bank holiday break was the second Sunday in January; this year it was the first Sunday (and Epiphany). So service patterns to ring for may not have been the same.


And the survey is limited. This is partly because I wanted to make it quick and easy to complete, but also because asking about the quality of ringing is quite difficult - what I think is excellent ringing may strike you as ringing so poor that you wonder whether it’s even worth mentioning that the standard could be better.


So this is nothing more than a snapshot. Like anything else, it gives us some idea of what is happening - but it’s not enough to rely on.


A summary of responses is in the PDF below.


Giles Blundell, 5 Feb 2018, 06:45