Princeton, New Jersey, USA
Saturday 11th April 2015

Data.  Information.  It’s the new black, the game-changer, the must-have for every organisation in the sports industry.  It’s everywhere.  


It’s available for the front office to optimise player recruitment, for fans to engage more fully with the game experience & for performance departments to monitor just about every biometric marker under the sun.  In fact, the big data revolution has just about blown every other fad, phase or innovation out of the water.


You would think, given the spotlight that has been shone on the tech revolution in sport, that analytics & metrics are new approaches that never occurred to anyone before the GPS device arrived on the market.  


Yet even before the advent of the heart rate monitor, we used Borg Scales to assess muscle soreness, RPE (rating of perceived exertion) scores to gauge how hard sessions were, VAS (visual analogue scale) scores to guide how much something hurt, questionnaires to evaluate how rested or recovered players were & smiley face scales to help shine a light on their readiness to perform.


In fact, these simple, easy-to-administer, quick-to-analyse scales were (& still are) more valid & more reliable than some of the technological toys we (over)rely on today.
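

To illustrate just how simple these tools are to work with, here is a minimal sketch (in Python, with made-up numbers) of the kind of arithmetic a coach can do in seconds: one widely used approach, the session-RPE method, simply multiplies a CR-10 exertion rating by the session duration in minutes to give a session load.

    # Hypothetical post-session RPE scores (CR-10 scale) and session length in minutes.
    # Session load follows the session-RPE idea: rating x duration (arbitrary units, AU).
    session_minutes = 70
    rpe_scores = {"Player A": 7, "Player B": 8, "Player C": 5, "Player D": 9}

    session_loads = {name: rpe * session_minutes for name, rpe in rpe_scores.items()}
    squad_average = sum(session_loads.values()) / len(session_loads)

    for name, load in session_loads.items():
        print(f"{name}: RPE {rpe_scores[name]} -> session load {load} AU")
    print(f"Squad average load: {squad_average:.0f} AU")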


I’ve visited teams whose conditioning staff have told me, with a straight face, that their warm-up had “a greater fatigue index than the high-intensity session that immediately followed it”.  Really?!?  “Well, just look at the Catapult data!!”  


It does seem that, when adopting GPS systems, some people feel there is no longer room for a healthy dose of common sense.  


If you want to double-check your interpretation or the accuracy of the data, just run a simple RPE questionnaire as the players walk past you from the pitch, breathing heavily, heavy-legged & dripping in sweat.  Now reconsider the revelation that just spilt from your lips.


In fact, inhale deeply now…there is a report from Australia that is about to be published, suggesting that the data most teams are using to guide their every waking decision (or at least so it seems), the Catapult data, has an error margin of 50-70%.  


Yep, take a moment to read that again.  This study is said to report that Catapult data allegedly has an error margin of 50-70%.  Does that sound like data you would want to consider in isolation, without any other form of monitoring?  To me that sounds a lot like a big red flag…to be polite & understated!
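

To make the scale of that figure concrete, here is a back-of-the-envelope sketch with hypothetical numbers, assuming (and this is my assumption, not the study’s definition) that the quoted error margin applies symmetrically to a single reported value such as a high-speed running distance:

    # Hypothetical reported value and the quoted error margins.
    reported_hsr_metres = 500
    for error_margin in (0.50, 0.70):
        low = reported_hsr_metres * (1 - error_margin)
        high = reported_hsr_metres * (1 + error_margin)
        print(f"{error_margin:.0%} error margin: the true value could plausibly sit "
              f"anywhere between {low:.0f} m and {high:.0f} m")

On those assumptions, a reported 500 m of high-speed running could correspond to anything from roughly 150 m to 850 m.  That is not a number to build a training week around on its own.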


We have to be cognisant that, in the grand scheme of things, much of this technology is still in its relative infancy.  I am still waiting to see the details of this study & as yet I don’t know whether the alleged Catapult error margins are a result of the data sampling rates or limitations in data storage.  


These issues will no doubt be resolved in future iterations & I really hope they are, because I love the principles behind these systems & the idea that they can deliver the data we are told they can, provided it is valid & reliable.  But let’s not be blindly led by what the salesmen tell us without first checking the data against our known standards.
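

Checking the data against a known standard doesn’t have to be complicated, either.  A minimal sketch of the idea, using hypothetical numbers and a measured course of known length as the criterion, might look like this:

    # Hypothetical criterion check: players complete a course of known length
    # and the device-reported distance is compared against it.
    known_course_metres = 400
    device_reports_metres = [388, 412, 371, 405]

    for reported in device_reports_metres:
        pct_error = abs(reported - known_course_metres) / known_course_metres * 100
        print(f"Reported {reported} m vs known {known_course_metres} m "
              f"-> {pct_error:.1f}% error")

If the errors you see against a simple criterion like this don’t match the confidence of the sales pitch, that tells you something.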


My other issue with the “Data Revolution” is that performance science departments in too many teams feel they have to collect reams of data, even if they don’t have the staff numbers to analyse it, act upon it to inform & change practice, or really take the time to fully understand it.  


I’ve consulted for teams who went to the time & effort of putting GPS units on all the players before training, sat someone in front of a computer during training & then collected all the units back in again after training.  Yet that same person couldn’t explain to me what most of the numbers meant, or the physiological impact of the numbers he could explain, & didn’t bother to report the data back to me (or anyone else) until I asked for it.  


In fact, I was told by another member of staff that the club President once used the GPS data to punish a goalkeeper for laziness because, according to the technology, he had run the least distance in training.  No kidding?  Well, there’s no question of invalid, unreliable data on that count then.  After someone managed to quietly whisper some common sense in the President’s ear, the GPS data was never mentioned again.


So why collect the data?  Well, in this instance, the President had bought the technology because the big teams in England were using it & the sports science staff didn’t want to admit they didn’t know what to do with it for fear of getting sacked & replaced.  


This is far from being an exception.  I visited a global leader in world sport a few months ago & was told of several very impressive tools the team were using to collect a library of data to inform the science & medical services staff whether or not players were fit to train.  


I left impressed…until I met a member of their team for a coffee later in the week, who confessed they actually did very little of what they talked about because it was too time-consuming & the data wasn’t reliable enough, but as long as the owners thought they were collecting the data, everyone was happy.  


I can understand the smoke & mirrors, the illusions & the need to please the person that pays your wages at the end of the month, but if you’re in that situation, there’s something wrong with your team’s culture & focus on performance somewhere along the line.  


It’s not the systems for collecting the data that are broken; it’s the ethos, the trust in the staff & the decisions to under-invest in human resources (both in terms of staff numbers & their education) in favour of the technology that need repairing.  


It appears that for some owners, general managers or sporting directors, having greater numbers of more competent sports science staff just isn’t as sexy as having the latest toys that the Joneses have.  


When I visit teams, I often get asked what data systems I use & which are the best.  In my opinion, the answer isn’t that straightforward & involves a lot of reflection & internal contemplation by the organisation before looking outwards for the most appropriate answer.


In fact I have two rules that I must satisfy before introducing data collection tools:


1) If you aren’t going to use the initial data or get the chance to retest where necessary, then don’t bother collecting it in the first place.  

Collecting data is time-intensive, demanding both effort & compliance from players & coaches who have a low tolerance for disruption, so if your data isn’t going to be used to inform your practice, don’t take these sacrifices for granted.
  
2) Don’t let data rule you.  Data is there to answer questions we may have & to ground the assessments we make in objectivity.

Data is not there to answer questions we don’t have, or to complicate our performance strategies for the sake of trying to be clever & becoming a slave to the sea of information some of this technology spews out.



I usually qualify my answer with the statement that “if you have a decent ratio of well-educated, experienced science & medical services staff to players, with budget left over, then let those same staff identify the priorities in terms of addressing the current performance strategy & that is the area in which to invest in technology & data systems.”


Please understand, the purpose of this article is not to bash technology & data.  Far from it.  The value of valid, reliable & objective data is huge in guiding & evaluating the work we do to ensure the athletes we are responsible for training are in the best shape.  


However, I do feel it is necessary to draw attention to the misplaced reliance on data & the inappropriate prioritisation of investment in technology ahead of the correct numbers of skilled, knowledgeable & educated science staff.


The best staff will always get results, objective results that can support the coaches with relevant information, enabling them to optimise their preparation, training loads & recovery timings.  


Even the best computers are worthless without a smart person to collect, analyse & interpret the data they produce.


