Enhancing data-informed decision-making in sport
This webinar was recorded on 4 November 2020 from the Human Performance Research Centre. Sign up to receive notifications about future events.
Dr Mark Watsford:
Morning everyone, and welcome to what is a great spring day in Sydney. And we're in for a treat with our webinar today with some really special guests that we've got. I'd like to start with a welcome to country, and I'd just like to acknowledge the Gadigal people of the Eora Nation, upon whose ancestral lands our city campus stands. I'd also like to pay respect to the elders, both past and present, acknowledging them as the traditional custodians of knowledge for this land.
Dr Mark Watsford:
So, my name's Mark Watsford, and I'm one of the research staff here at UTS, and I'm the Deputy Head of School of the School of Sport, Exercise and Rehab. And it's my privilege to be able to introduce three data scientists today, and we're going to have a really good chat about some of the ins and outs of working with data. And it's really fortunate for us to be able to get some words from these guys who are often behind the scenes, we don't often hear about big name data scientists. But make no mistake, these are the guys who are informing the practice that goes on with coaches, with the players that we see on the field.
Dr Mark Watsford:
So, I'd like to introduce Dr Patrick Ward, Dr Andrew Novak, and Dr Tom Kempton. And Tom's from the Carlton Football Club in Australia in AFL. Andrew's from Rugby Australia, also working in Sydney. And Pat's over in Seattle working for the Seattle Seahawks in the NFL.
Dr Mark Watsford:
What we're going to do today is, we've got an hour to discuss a few really interesting and poignant issues in the world of performance analysis and data science. And part of UTS's mantra I suppose is to get involved with the community. This is our second webinar for the year, following one from Franco and Aaron halfway through the year. And we're really trying to just convey some really important information about applied sport science. And we're going to have the chance for you as the audience to post some questions to our panelists along the way, and you can do it in the chat, do it in the Q&A section, and we'll be able to hopefully field a few of those throughout.
Dr Mark Watsford:
I'm just going to throw to our panelists and let them introduce themselves in terms of what they've done and where they're at currently. And then we'll get into a bit more pointed discussion. So Tom, could you take it away with just a brief introduction for us?
Dr Tom Kempton:
Thanks, Mark. It's good to be here, and hello to everyone watching along. So, my name's Tom Kempton obviously. My background is I did UTS undergrad sport science quite a few years ago now. I went on and did an honors project through UTS and got a bit of an interest in research through that experience. And then went on to complete a PhD, again at UTS, under the supervision of Aaron Coutts. I was working embedded within professional rugby league throughout my PhD, and my thesis looked at factors that related to match performance in professional rugby league. And then once I'd finished my PhD I moved into AFL, and I've been fortunate enough to be the sport science manager at Carlton for the last five seasons.
Dr Mark Watsford:
Thanks, Tom. Andrew?
Dr Andrew Novak:
Yeah. So, I guess similar pathway. [inaudible 00:03:10] traditional sport science pathway. I did my undergrad at the University of Newcastle, then moved on to an honors and did a PhD at Newcastle as well, based on the Central Coast. There I was studying psychophysiology and performance, and then moved into mountain biking performance specifically. So, bringing athletes into the lab, capturing various aspects of their performance in the lab, and then taking them into the field and seeing how those measures actually translate to field-based performance for mountain bikers.
Dr Andrew Novak:
I guess through that process I became really interested in modeling data and modeling performance. And alongside that I was starting to do a lot of software programming, so developing iPad apps that we could use to capture some aspect of performance, whilst doing a little bit of video game development on the side. So, software programming from that perspective, quite different to [inaudible 00:04:06] and other things that we might typically use as a data scientist.
Dr Andrew Novak:
And I guess coming out of that I ended up working for both Rugby Australia and Fusion Sport in a joint role. So, Fusion Sport runs Smartabase as their main product, which is an athlete management platform. And so, I got a lot of exposure to the types of data that are collected in team sport. Whereas through my PhD and honors I was working more with cycling, which is quite different.
Dr Andrew Novak:
So now, yeah, as Mark said, I'm working with Rugby Australia, but also tied in with UTS as a research fellow. At Rugby Australia I'm part of the central data analytics team based here at Moore Park, where we support as many of the teams as we can. So, coaches, strength and conditioning, physios, medical. A couple of the guys are going into business analytics and supporting that side of the business as well. So, I guess mainly research projects, reporting, dashboards, those kinds of things, for any stakeholders interested in data.
Dr Mark Watsford:
Yeah, thanks Andrew. And Pat?
Dr Patrick Ward:
Thanks, Mark. And Andrew and Tom and Blake for having me, it's great to be here. Yeah, my background is probably different than both of theirs. I was a defiant 18-year-old and my mom wanted me to be a doctor and that was a perfect reason for me to go to music school. So, my undergraduate was in jazz guitar. And then I graduated and realized you could only play so many $20 gigs before you have to pay rent.
Dr Patrick Ward:
So, I went back to school and I studied exercise physiology. And that brought me through a pathway to the Nike Research Lab, where I worked for a few years, before going to the Seahawks as a sport science analyst. I was the only one on the team, and I think I was probably the only one in the NFL at the time. And then I worked up, so now I'm the director of research and development. I did my PhD at Liverpool John Moores, with Barry Drust as my primary supervisor, on training demands in American football using accelerometers. And Aaron Coutts was one of my co-supervisors, so that's kind of the connection here.
Dr Mark Watsford:
All right. Thanks Pat. And if we get time you can get the guitar out and serenade us later on.
Dr Patrick Ward:
Yeah, we'll do it.
Dr Mark Watsford:
So, let's start. You guys are researchers at heart obviously, all got your PhD. Maybe Pat, can you just talk to us about just your overview and how you use research, and how you use frameworks in your approach to research in the work that you're doing?
Dr Patrick Ward:
Yeah. I mean I think... People often ask me, "Oh, should I get a PhD? And do you think it was valuable?" And things like that. And I can't say that getting a PhD put me in the job that I have now, not at all. I didn't need a PhD to have this job, more like luck and relationships. But I do think what it does do is it gives you an appreciation for the scientific process, which is a process that is rooted in understanding how to ask a question succinctly. And then put in place a methodology to answer that question.
Dr Patrick Ward:
And so, as we go through our working day and answer questions, whether the... we work across football operations, so everything from scouting and talent acquisition, to coaching and opponent analysis, to sport science and more of the traditional pieces of sport science that you would think about. I always try and think through the methodology of, "How am I going to answer this question? What is the question? And what type of analysis or what type of approach is going to be most appropriate to answer this question?" I think that's really the science piece that comes out of it. I think that's what...
Dr Patrick Ward:
I don't know what it's like necessarily in Australia. In America lots of people work for teams in roles called applied sport scientist, but they don't really have a scientific background. And so, because they don't have any of that training, you look at the things that they do or the technology that they use and the way that they apply it, and my thing is like, "Where's the science in this?" There's nothing scientific about it, right? So I think to me, that's really what it's about. It's having a general framework for how we go about exploring a phenomenon and answering a question.
Dr Mark Watsford:
Okay. And if you want the right answer, you've got to ask the right questions.
Dr Patrick Ward:
Yeah, absolutely.
Dr Mark Watsford:
Tom, have you got a different approach? Or is that the kind of approach you'd take as well?
Dr Tom Kempton:
I agree with Pat, I think he summed it up really nicely. The other thing I'd add is that you also gain the ability to read literature and scientific articles, which can sometimes be pretty inaccessible. Learning to critically appraise information means you can make your own decisions, as Pat touched on, about which technologies are valid and what you should and shouldn't use. Because in sport science and professional sports there are so many different products and things out there, and you need a pretty strong BS detector I think to work through which ones are actually useful and which ones are marketing. So, I think that's another thing that can come out of the research framework, that ability to critically analyze what you're faced with.
Dr Mark Watsford:
Yeah. And there's the tension I suppose between the day-to-day and the week-to-week and then the year-by-year approach as well. So, there's what's often referred to as the fast and the slow approach, and you guys juggle many things on a weekly basis. And hopefully, during this hour's chat, we can drill down into what the important things are on a week-by-week basis, but also what the year-by-year decisions are.
Dr Mark Watsford:
Andrew, if I can throw over to you, just about something about some of the day-to-day metrics that you rely on. And how you go about identifying what the important things are to look at, do you rely on coaches to inform you? Have you got a bit of a free leash or a long leash to go and explore things that you want? Can you give us a bit of insight as to how you go about it?
Dr Andrew Novak:
Yeah, good question. I guess because I work across Rugby Australia and UTS, when I'm at UTS I'm kind of more involved in bigger research projects and perhaps looking at things that coaches may not necessarily be coming to us with on a day-to-day basis, so I've got time to explore those there. When I'm with Rugby Australia it's more of that fast approach I guess, where you've got day-to-day problems that need to be solved, questions that are being asked. And that's quite different, and I guess there I'm largely guided by coaches and [inaudible 00:10:57] coaches and those domain experts I think. That's really important in that space, is to be guided by them. And so, we do a lot of technical things, providing reports, but it's all really guided by the subjective expertise.
Dr Mark Watsford:
So, it's subjective leads it, but then it's objective data. Is that how it works for you as well, Pat? Or have you got freedom or are you quite constrained in how you have to go about your problem solving?
Dr Patrick Ward:
No one constrains me, I do what I want. No, yeah, I mean I think you always have to work within the confines of, a lot of the times, what people want to know about, the questions that they have. At the same time, we do have the freedom to explore different things that we want to explore.
Dr Patrick Ward:
Regarding the subjective piece that you just touched on. I'm probably a little bit different in terms of... I think a lot of people want to make sport about the rise of the machines and have algorithms explain everything. And I think there is some value in domain expertise, and it's how you bake that in. This is where I think about modeling strategies that can combine subjective and objective information. Or modeling strategies that are more explicitly Bayesian, in that they can take information as a prior that maybe someone believes to be true, which can then be updated with new observed information.
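As a sketch of that prior-updating idea, a conjugate normal-normal model combines a subjective prior (say, a scout's rating of a player) with observed data, weighting each by its precision. The function name and all numbers below are illustrative, not anything from an actual club pipeline:

```python
# Normal-normal conjugate update: combine a subjective prior belief with
# observed data, each weighted by its precision (1 / variance).

def update_belief(prior_mean, prior_sd, obs_mean, obs_sd, n_obs):
    """Posterior mean and sd after observing n_obs data points."""
    prior_prec = 1 / prior_sd ** 2
    data_prec = n_obs / obs_sd ** 2          # precision grows with sample size
    post_prec = prior_prec + data_prec
    post_mean = (prior_mean * prior_prec + obs_mean * data_prec) / post_prec
    return post_mean, post_prec ** -0.5

# A scout rates the player about 70 (sd 10); five matches average 80 (sd 8).
mean, sd = update_belief(70, 10, 80, 8, 5)
print(round(mean, 1), round(sd, 1))  # posterior sits between belief and data
```

Here the posterior lands nearer the data (about 78.9) because five observations carry more precision than the prior; with fewer matches, the scout's belief would dominate.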
Dr Patrick Ward:
The hard part in sport is that most of the people that we work for, coaches or scouts, managers, they have models in their head that they think are important. They have these heuristics, these rules of thumb, these things that they think are important, how they weight those things is entirely up to them. But in their mind they have this model of, "I know what a wide receiver looks like, and I know it when I see it, and he runs like this and he looks like this, and blah, blah, blah, blah, blah." And that's okay, we have models as well, right? We have mathematical models. But we have to combine those.
Dr Patrick Ward:
And the two things I'll say about that. The one thing that I always tell them is, it's like driving down the highway, and if you're looking to change to the other lane, you always look in your rear view mirror and you always look in your side view mirror. And if both mirrors are clear you go ahead and make that change. And if they're not, you pause for a second and try and see what's going on.
Dr Patrick Ward:
And so, if the rear view mirror is that heuristic, that rule of thumb model in their head of what they know the game looks like, then the side view mirror is just our mathematical models. And if both of those check out, we generally feel pretty good about the process around making a decision, so we go ahead and move and switch lanes. And if those two things don't check out, then we pause and I get into long conversations with our pro scouting director about, "Well, why do we think that is? And can we explain that?" Et cetera, et cetera.
Dr Patrick Ward:
The other thing I would say to that, where I think we have to do a better job in our field in terms of data collection, is helping these individuals keep score of their heuristics and the things that they think are important, and the decisions they make, and the way that they arrived at those decisions. Because if we want to take what they're saying seriously, we have to know how much weight to assign to it. And if they are poor at making these decisions, or they have a lot of bias about the good decisions that they've made and feel very good about those, and conveniently forget the ones that they were incorrect about, we can't learn anything from that.
Dr Patrick Ward:
I think it was actually Gerd Gigerenzer who wrote a book called Gut Feelings, and it was fantastic... it was like page 16 or 14, I'm pretty sure it was page 16, at the top of the page, where he talks about how, if you want your rules of thumb and your gut instinct to be taken seriously, you need to keep score of how often you're right and how often you're wrong. And I think we can help a lot with that in terms of building tools that allow them to collect these things that are subjective, but in an objective manner.
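The score-keeping Gigerenzer argues for can be as simple as logging each call with a stated confidence and computing a Brier score afterwards. This is an illustrative sketch with invented data, not a tool any of the panelists described building:

```python
# Brier score: mean squared gap between stated confidence and what happened.
# 0 is perfect on certain calls; always saying 50/50 scores 0.25.

def brier_score(confidences, outcomes):
    """confidences: probabilities in [0, 1]; outcomes: 1 if it happened, else 0."""
    return sum((c - o) ** 2 for c, o in zip(confidences, outcomes)) / len(outcomes)

# A logged history of recruiting calls: stated confidence vs. actual result.
calls = [0.9, 0.8, 0.6, 0.7]
results = [1, 0, 1, 1]
print(round(brier_score(calls, results), 3))  # 0.225
```

Tracked over a season, a score like this makes the "how often were you right?" conversation objective rather than a matter of memory.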
Dr Mark Watsford:
My gut feeling is you've got a pretty good memory on board there, Pat, if you can remember specific pages and lines of text.
Dr Patrick Ward:
Only a few. If I highlight them I can remember them.
Dr Mark Watsford:
So Tom, in terms of some metrics and things, how do you know what to use or what not to use? What do you do? Do you keep score in terms of overall metrics? But how do you know whether something's reliable or valid? And do you care? And yeah, how do you know what to use, and equally importantly what not to use?
Dr Tom Kempton:
It's a bit hard to follow on from the Beautiful Mind just there, but I'll try and answer as best I can. I think what it comes down to is you've always got to keep front-of-mind that it's not about the data or the metric, it's about the problem you're trying to solve. That's why we're here; we've got to keep focused on what the issue or the question is. And there are a lot of different ways to solve that. It might be intuition, or it might be experience or relationships. But our framework for understanding the world, for us type of people, is usually data. So, that's what we apply and that's our response to the problem, and that's always got to be more important than the specific metric.
Dr Tom Kempton:
But as you come to the metric, I think there are a lot of... and I'm not going to talk about any specific metrics. But there are a lot of principles that are helpful to keep in mind. The first thing we've touched on: there's so much more data now coming into sport. Just because we can measure something doesn't mean we should, or doesn't mean it's important. So, it's important I think to be able to distill, from the 800 different GPS metrics, the two or three that we really want to rely on, and then educate our decision-makers on how to use them well. So, it's important to use reduction methods that our research group and others have applied, like PCA, just to reduce the amount of data and the number of metrics down to a couple that are really important.
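As a rough illustration of that kind of reduction, PCA finds the few components that carry most of the variance across many correlated metrics. This sketch uses synthetic data and plain NumPy, not any club's actual GPS export:

```python
import numpy as np

# Synthetic example: 8 GPS metrics that are really driven by 2 underlying signals.
rng = np.random.default_rng(0)
signals = rng.normal(size=(50, 2))                    # 50 sessions, 2 true factors
weights = rng.normal(size=(2, 8))
metrics = signals @ weights + rng.normal(scale=0.1, size=(50, 8))

# PCA via SVD on the centred data.
X = metrics - metrics.mean(axis=0)
U, S, Vt = np.linalg.svd(X, full_matrices=False)
explained = S ** 2 / np.sum(S ** 2)                   # variance explained per component
scores = X @ Vt.T                                     # sessions projected onto components

print(np.round(explained[:3], 3))  # nearly everything sits in the first two
```

The point of the exercise is the `explained` vector: when two components account for almost all the variance, the other six metrics are adding noise, not information.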
Dr Tom Kempton:
I also think, yeah, there's this idea of understanding the validity and reliability of the metric: how reliable is it, is there a lot of noise? Can you detect the signal out of it really clearly? With the US elections going on there's been some fascinating discussion about the polls four years ago and how reliable they were, and how much noise there is versus true signal. And Nate Silver, who works in that space, is probably the godfather there and wrote a terrific book, The Signal and the Noise. I think a lot of those concepts are really important.
Dr Tom Kempton:
But in terms of what I wouldn't use, I guess it would be black-box kind of metrics where you don't know how they were derived or what they're representing. And I think it's also... I believe you guys had Franco and Aaron speak in a previous webinar. But it's important to understand the theory and the framework behind a particular metric: does it actually represent what it's meant to show? And I think the acute:chronic workload ratio is a good example of a contemporary debate about whether a metric is appropriate for the problem at hand. So yeah, that's probably a bit of an overview of how I think about the metrics that we employ generally.
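For reference, the ratio Tom mentions is conventionally computed as a short-window mean load over a long-window mean, often 7 days over 28; whether that construction actually answers the injury question is exactly the debate he points to. A minimal sketch with made-up loads:

```python
# Acute:chronic workload ratio: recent mean load over longer-term mean load.

def acwr(daily_loads, acute_days=7, chronic_days=28):
    """Ratio of the last acute_days' mean load to the last chronic_days' mean."""
    if len(daily_loads) < chronic_days:
        raise ValueError("need at least one full chronic window")
    acute = sum(daily_loads[-acute_days:]) / acute_days
    chronic = sum(daily_loads[-chronic_days:]) / chronic_days
    return acute / chronic

loads = [400] * 21 + [550] * 7        # a steady month, then a heavier final week
print(round(acwr(loads), 2))  # 1.26: recent load sits above the chronic baseline
```

Note the chronic window here contains the acute week (the "coupled" form); some practitioners exclude it, which is one of several methodological choices the critiques of this metric focus on.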
Dr Mark Watsford:
So, it's one thing then to receive measures, and you've got to arrange it, you've got to process it, you've got to analyze it. Ultimately you need to be able to show it and display it and talk to coaches or athletes. So, let's just have a quick chat about dashboarding or other ways of visualizing and reporting data. Andrew, what's your general approach? How can you actually convey this information? Obviously a conversation is one way. But how do you actually go about planning the visualization and presentation of a dashboard?
Dr Andrew Novak:
Yeah. So I mean, there are general principles of visualization that you can follow, and various color schemes; you may not want to use a green and a red unless you're trying to show that something is good versus bad. We use a platform called Tableau, and they've got a lot of great resources for this introduction to data visualization, that kind of thing.
Dr Andrew Novak:
In terms of how [inaudible 00:19:27] data, we went down the path of doing some really complex visualizations in 3D, as I said, using video game engines and things like that, and some really cool stuff. But one of the issues that we had was that to deliver that to a coach, they've kind of got to be going to this external platform to actually use it.
Dr Andrew Novak:
And so, we've actually pulled back a little bit from there, and what we do now is provide as much data as we can within SportsCode. So, a lot of the processes that we have, generate files that go into SportsCode, and they might show physical and technical and tactical data within there, which is where the coach actually goes to for their data. I think the video to provide the context of what happened in a match is key, that's what the coach understands and it gives them a lot of context about the game.
Dr Andrew Novak:
And so, a lot of the time we just want to be supplementing additional information in there. For example, if a coach wants to know the speed players were running relative to their percent max velocity into a contact, or something like that, we can provide it alongside the video in there. So, we do a lot around that [inaudible 00:20:37] in addition to just dashboards for longitudinal recording and static match reports and things like that too.
Dr Mark Watsford:
So, rather than video being the measure, video supplements the measures that you're effectively using. Pat, is that a similar approach for you? Have you got a specific dashboard you use? Or is it just a unique week by week approach? How does it work for you?
Dr Patrick Ward:
Video is definitely the language that coaches tend to communicate in. But that's not all we use, of course. We build all of our stuff in-house. I generally develop in Shiny, in R, so that's my preferred approach. So, we have websites, and they're tailored to the specific question that the end-user needs answered.
Dr Patrick Ward:
For example, a website that's directed at opponent advance scouting is going to have information about the upcoming opponent: the things that they do well relative to league average, the things that they don't do well. And some information about our match-up probabilities, like where we feel our best match-ups are, the ways that we think we can take advantage of the opposition. Or maybe the ways that we think the opposition can take advantage of us, and hey, let's try not to do these things.
Dr Patrick Ward:
So, we build a lot of that stuff in-house. That being said, I do think you need to know what it is the end-user is most comfortable with. Like we have some coaches that literally don't want any charts and graphs. Like they believe that I've done all the work, and they were like, "Dude, just give me the four bullet points."
Dr Patrick Ward:
We had another coach, a really senior coach putting together playbooks, and we presented to him for like three straight weeks. Every week we'd present the upcoming opponent. And he stopped us in like week four, and he was like, "Yeah, you're telling me there's red and green dots on the board, right?" And we're like, "Yeah." He's like, "I'm colorblind." And I'm like, "Holy crap, we've been going through this for four weeks and you didn't even tell us that you couldn't see any of the context." So for him, he just wants the written word, just hit me with the bullet points, and then give me the game and play IDs so I can go watch the video and see what you mean.
Dr Patrick Ward:
So, you kind of have to know what it is that people want. A good book, though, if you're really into understanding dashboards and visualizations, is one by a guy named Stephen Few, called Signal. It's a really nice read just on how to present data. And then I'd also spend time with FiveThirtyEight and ESPN.com; I think they do a great job of presenting data in a very visually appealing way. I derive a lot of inspiration from those. But yeah, it's about knowing what the end-user is most comfortable with. I was always told to have these amazing dashboards, and the end-user sometimes is like, "Dude, I just want you to tell me what it says."
Dr Mark Watsford:
It's interesting, because the bright lights would probably entice people to go and develop some really fancy looking thing. But yeah, at the heart of it is what do they want? And in what format? And yeah, I think people need to really respond accordingly.
Dr Mark Watsford:
So Tom, I'm going to just ask you now, away from the dashboarding, moving into more like a... not a what do you do, but the application across a year of what a data scientist or performance analyst might do. Do you in your role help out with recruiting? And how much? And what sort of scope do you have in terms of the yearly cycle? So, away from a weekly cycle where you might be doing some opposition scouting and things. In terms of the yearly approach, what are some things that you're involved with?
Dr Tom Kempton:
Yeah, that's a good question. So, it's really evolved over the five years that I've been in AFL. I came in as a sport scientist, that was my background. And I guess year round it was about improving performance on the field and providing scientific advice on training, preparation, recovery, all those kinds of things. And then it's evolved along with the increase of data that we're getting into clubs. I think a lot of pro teams might not necessarily hire a data scientist, but somebody like a sport scientist who is familiar with numbers and understands [inaudible 00:25:21] practice and statistical analysis and things like that almost becomes by default the person charged with setting up these data management streams and helping the whole football department manage the data and use it to make good decisions.
Dr Tom Kempton:
And Patrick and I and another colleague wrote an editorial a couple of years ago. Kind of tried to capture that sense that we come in as a sport scientist from maybe physiology backgrounds, but then we can expand our scope across the football environment to help a number of different areas with that idea of how they organize their data, how they interpret it, and even how you review your processes and keep score. So, if anyone's interested that's probably a good place to start for a bit of a read on that.
Dr Tom Kempton:
But getting back to the question I guess, that probably shows that we've moved out from just being focused on training and now it's a year cycle where you can touch across different parts of the football department. So, preseason in Australian football is quite long, it's about four or five months, which is quite different to a lot of pro American sports and European football. So, it's quite a long training block. And in that time the focus is on preparing the players for the matches. So, there's a lot of training, three to four main sessions a week. So, all the analysis is really focused on optimizing player availability for training and building fitness levels and minimizing injury, those kind of things.
Dr Tom Kempton:
But it's also a time where, because there's no games and you do have a little bit more freedom, is where you can really dive into doing a lot of analysis and projects and more of that slow process, rather than the day-to-day. Whereas in-season it changes and it becomes really regimented, so you're following the weekly cycle between each game. And then people like to be set in their ways a little bit in-season, and so you're not really bringing a lot of new things in or changing a lot. But you're hopefully implementing some of the projects and some of the learnings you discovered during the preseason period.
Dr Tom Kempton:
The in-season is interesting as well because there's the performance every week, so that becomes the focus. And then there's a mood in the club depending on whether you're going well or not. So, that's a layer that can affect the type of analysis you're required to do.
Dr Tom Kempton:
And then I guess off-season some sport scientists and data analysts now get involved in the trade period and player recruitment. I think we're a little bit behind the US definitely in that space. They really set the bar, particularly in baseball, for how they go about objectively valuing players. But I guess yeah, it kind of shows that you can have touch points across all the football department, and across the entire season for sure.
Dr Mark Watsford:
Yeah, it's certainly not limited to just influencing coach's decision-making, it does go across the board. Andrew, have you got anything that you do with the more management side of things? Or is your role more specifically on the athlete performance side?
Dr Andrew Novak:
Certainly more the athlete and performance side. I guess coming from a sport science background initially, it makes sense that a lot of the main stakeholders I work [inaudible 00:28:32] with are the S&C coaches and sport scientists at each team. My role in terms of the yearly basis is probably a little bit different to somebody like Tom, who's got that quite long preseason and off-season period. For us, we've got multiple teams playing in the Super Rugby competition, and by the time that ends the Wallabies are starting up. And by the time that ends, we're almost back at Super Rugby again. So, we're constantly balancing these fast projects with slow, longer research projects, where we try to consolidate a lot of data and provide feedback about what that season looked like, for example.
Dr Andrew Novak:
But when you've got the next season already starting and those guys are wanting you to analyze matches, we kind of have to find a way to balance that. I guess a good example would have been last year where we changed over GPS systems completely between Super Rugby and the Wallabies season. And so, we're kind of deciding do we actually analyze that Super Rugby season and provide feedback on that? Or do we wait until they've got data from the Wallabies with these new GPS units, run a validation study, and then proceed forward from there?
Dr Andrew Novak:
So yeah, we're constantly trading off those in our team. I'd say generally for me I'm focusing a little more on those longer, slower, bigger research projects, like taking all of that data and putting out some sort of report or injecting those findings into a dashboard that's then used in the next season for example.
Dr Andrew Novak:
I guess a lot of my role throughout the year is what we call feature engineering. So, because we've got quite a small analytics team, we don't have a huge team of video analysts. What we do is we use the Opta database to get as much information as we can and make that as useful as it can be. So for example, coaches were interested in looking at how often teams are getting into the attacking 22 zone and how efficient they were at converting that into points.
Dr Andrew Novak:
And so, that sort of information isn't explicitly coded by Opta, and if we wanted it, video analysts were having to code it themselves manually. So, what we did was take the Opta database and find a way to engineer those features by structuring the data and running a very specific set of calculations, and then all of a sudden it's not a manual process: we can apply it across our whole database, for all the teams and all matches, and we can run a research project on it. So I guess that's where my role fits a lot of the time. It's about adding value to the data that we've already got.
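A toy version of that kind of feature engineering: deriving a 22-entry conversion rate from a flat event stream. The event structure, type names, and scoring set below are invented for illustration; Opta's real schema is different:

```python
# Engineer a "22-entry conversion" feature: how often a team's entries into
# the attacking 22 are followed by a scoring event.

SCORING = {"try", "penalty_goal", "drop_goal"}

def conversion_rate(events, team):
    """Share of a team's 22 entries whose next event is a score by that team."""
    entries = scored = 0
    for ev, nxt in zip(events, events[1:] + [None]):
        if ev["team"] == team and ev["type"] == "entry22":
            entries += 1
            if nxt and nxt["team"] == team and nxt["type"] in SCORING:
                scored += 1
    return scored / entries if entries else 0.0

events = [
    {"team": "A", "type": "entry22"}, {"team": "A", "type": "try"},
    {"team": "A", "type": "entry22"}, {"team": "A", "type": "turnover"},
    {"team": "B", "type": "entry22"}, {"team": "B", "type": "penalty_goal"},
]
print(conversion_rate(events, "A"))  # 0.5
```

Once the feature exists as a calculation rather than a manual coding job, it can be backfilled over every team and every match in the database.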
Dr Mark Watsford:
And yeah, both Tom and Andrew mentioned a bit of a periodized approach, that there are really these distinct periods where you're working on different things, and it's nice to be afforded a bit more time in that off-season evidently. Pat, in terms of your work away from the field, on the front office and more business side. I understand you have a fair bit of discussion there, a role where you can certainly work with player recruiting, whether it's long-term athlete development or scouting. Can you give us a bit of insight into that side of the analytics world?
Dr Patrick Ward:
Yeah. I mean, it's probably as people would imagine it to be, if you think of the movie Moneyball where there's scouts and people making decisions, and then there's a bunch of nerds who are running computer algorithms and trying to I guess interject the information somehow into the pipeline. The extent to which we would participate depends. Obviously today was the trade deadline for the NFL so we can't make any trades now, so it ended at 1:00, so there's a lot of stuff now that I wouldn't be doing, that I was doing leading up to this. The next big push is going to be immediately when the season ends we go into free agency and then draft mode.
Dr Patrick Ward:
And so, we're trying to build models that vary depending on the type of situation. If it's the draft, you're trying to build or identify models that can help project a player's value to your team within their first four years, right? A four-year contract is generally what a rookie would get. If it's free agency, now we're talking about players who are on their second, third, maybe even fourth contract in the NFL, and we're looking at models that are more along the lines of career curves: the rise and the peak, and then I guess the fall of the athlete as they age, aging curves. So, a lot of that stuff is targeted towards the kind of question we're trying to answer.
Dr Patrick Ward:
And then I think the other bit, which is the part that is most interesting to me and the part where I think sport science has continually dropped the ball in team sport, is trying to map specific aspects of a player's physical ability to components within the team sport game. And I'm not talking about aggregated stuff like how much player load did the guy get in the full AFL contest. I'm talking about very specific tasks within the game where we can identify players that have unique attributes to perform those tasks. And so, that's the third part where I try and, if nothing else, influence the conversation. And that to me is really what data analysis is about. It's not to supplant expert judgment or things like that. But it's there to help evaluate a process and help to provide information for a discussion.
Dr Mark Watsford:
You're trying to be a front office influencer, and not to be confused with an Instagram influencer.
Dr Patrick Ward:
Yeah, exactly.
Dr Mark Watsford:
So, Sam Marshall's asked a question that's related to what Andrew was just talking about. What's your ratio of using metrics provided by external companies and commercially available products, versus your own data? Tom, have you got an answer to that? How much custom stuff do you use versus the originally supplied stuff?
Dr Tom Kempton:
That's a good question. Off the top of my head I'd say we don't use any kind of derived variables that we don't create ourselves. So, obviously we'll take an input like total distance or whatever it might be from GPS. But yeah, we basically use all our own metrics.
Dr Tom Kempton:
But probably IMA is something from Catapult that we use, which is a measure of change of direction, acceleration and deceleration, which we've found the most problematic aspect of physical output in games to measure. GPS is pretty strong for measuring your distances and your velocities and things like that, but when you're changing direction there's a fair metabolic cost to that, and a lot of load on the lower limbs, and so how we measure that is pretty challenging.
Dr Tom Kempton:
So, IMA is one that Catapult have put forward. And we've done a few internal validation studies in collaboration with a research partner, and just to give us a bit of an idea of how reliable it is, and it's good enough. But yeah, I think generally we just try and use the raw inputs and then create our own metrics based on that where we can.
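An internal reliability check of the sort Tom describes can start as simply as a test-retest typical error and coefficient of variation. This is a generic sketch of that calculation, not Carlton's actual validation method, and the numbers in the usage example are invented.

```python
import statistics

def typical_error_cv(trial1, trial2):
    """Test-retest reliability for one metric measured twice on the same
    athletes: typical error = SD of the differences / sqrt(2), plus the
    coefficient of variation as a percentage of the grand mean."""
    diffs = [b - a for a, b in zip(trial1, trial2)]
    te = statistics.stdev(diffs) / 2 ** 0.5
    cv = 100 * te / statistics.mean(trial1 + trial2)
    return te, cv
```

A CV of a few percent is usually taken as "good enough" for load monitoring; the point is to know the noise floor before interpreting week-to-week changes.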
Dr Mark Watsford:
Andrew, similar or different?
Dr Andrew Novak:
Yeah, so as I said, because we've got a fairly small team I would say we probably rely on those measures a lot more than others, which is why I always go down that feature engineering process. Because then, say with the [inaudible 00:36:45] 22 example, if we can derive that from Opta, then that all of a sudden frees up video analysts to be coding something else, I can use their time on something else. So yeah, I'd say we probably use it a bit more heavily than other teams who might have a bigger budget and more video analysts to code what they want to code manually I guess.
Dr Tom Kempton:
Just to clarify, we definitely use inputs that are coded. So, [inaudible 00:37:12] or whatever it might be, we'll definitely take all their raw inputs. And we'll ask them about what their coding reliability is and things like that, which some companies are really forthright with, some aren't. But we'll definitely use those raw inputs. But in terms of using a SuperCoach score or a magic number that companies provide, that's what we avoid. Having said that, we use Champion Data's player rating because that's something that Karl Jackson validated in his PhD. So, things like that we will use if it's validated. But yeah, just to be clear we will take raw inputs and things like that.
Dr Mark Watsford:
And Pat, just a quick answer to that. Custom or factory settings?
Dr Patrick Ward:
We try and customize everything to the specific context of our team, our season, our coaches, the context of the game, those kinds of things, yeah.
Dr Mark Watsford:
Okay. Let's just change tack a little. Let's talk COVID. How has the role of an analyst changed, if at all, this year? And I'm going to throw that to Pat, being I guess six months since the commence... well, maybe seven or eight months now. How's life as an analyst changed because of COVID?
Dr Patrick Ward:
I mean, I get my nose swabbed every day. It's pretty much the same. The thing is, which I find interesting, is that we went through this whole off-season where we couldn't do anything and everyone was quarantined and staying at home. And I think if you're in a role where you have to work hands-on with the athletes, maybe a strength coach or a fitness coach or a physio or something like that, you're really sitting on your hands because you're like, "Jeez, if the customers aren't here I have nothing to do."
Dr Patrick Ward:
But for me, as long as I can log into our database I can do all my work. And actually it's sometimes worse during quarantine because the customers are everybody that has a question in the organization, and if there's nothing going on, all of a sudden people have lots of questions about things they've been pondering and querying. And so, I think for me it was probably a little more busy than anything.
Dr Patrick Ward:
So, it really hasn't changed. I mean, as an analyst I can pretty much work from anywhere. I still go into the office, but there are a couple of days a week where I just work from home for half of the day. So, I can't say that it's changed a whole lot for me.
Dr Mark Watsford:
Okay. Tom?
Dr Tom Kempton:
Yeah, so my experience has been a little bit different. So, we had the situation where, as a lot of people probably know, we had to go into a bubble environment, so it meant relocating to another state. And in that situation it was great that we could continue our work, but I think we went there with fewer staff, and so there was probably a little bit less time to focus on the data analysis and more on what needed to be done in the day-to-day moment. So, on top of your normal role, supporting that, there were times where we were up scissor lifts filming training or washing jerseys or packing up the gear truck after a game. So, that probably did take away from the time available to do more classical analysis.
Dr Tom Kempton:
But I think, yeah, with the way the industry's been impacted by COVID, in Australia particularly there have been cuts in budgets across football departments. And I think the analytics department is possibly going to be affected by that.
Dr Tom Kempton:
But one interesting area, I think, with fewer staff available, is the type of work that Andrew spoke about, where you can gain efficiencies in processes and things like that. I think that can be a really useful contribution from data analysis in the leaner industry we're faced with for the next couple of years at least.
Dr Tom Kempton:
So, I think that's a really good opportunity for us to position the worth of a data analyst. Because often the GM will have to make a decision between a data analyst and a coach, or a training camp, or whatever it might be, when they're doing the budget. So, I think it's important to show the worth in the application of this and not be perceived as a luxury. That's a real issue in the environment we're in at the moment.
Dr Mark Watsford:
That's good [inaudible 00:41:37] smelling roses still, and I think maybe getting back to basics and washing some jerseys or running some drink bottles has brought you back down to earth, hey Tom? Pat said before that the rear view mirror and a side mirror can both provide some info, and then Tom mentioned budget cuts, things that have pretty heavily affected most Australian codes going forward for the next 12 or 24 months. Hopefully it's not like a truck sideswiping that side mirror, so that people would only rely on that rear view mirror. Yeah, we'll wait and see. Andrew, has life changed because of COVID for you in Rugby? Or is it business as usual?
Dr Andrew Novak:
Definitely not business as usual. Yeah, so a really strange time; it was obviously unprecedented and really unknown. We had this, I think we got seven rounds in or something before it was canned. And then when we came back we had a different competition. So, there was a lot of analysis that had to be done around that when new competitions started up that coincided with different rules, so there was more [inaudible 00:42:45] in playtime and everyone wanted to know, "Okay, what are the physical intensities of this new competition compared to the old?" So, there was a lot of that.
Dr Andrew Novak:
I guess before we even came back there was a big focus on the return to play, because again, going back to that research framework, there was no evidence here on anything like this before. And so, this is where it was largely up to the domain experts, the S&C coaches, to basically come up with the milestones that they wanted players to achieve before they returned to play post-COVID.
Dr Andrew Novak:
I guess once we did come back, there were certainly bubbles, and certain teams had to travel out of Melbourne and so on. One of the issues that we did have is that teams could only have 10 staff members with them on field, and I don't think that in any case included sport scientists. So, the guys running the GPS units weren't even there; they were back at home. And it all of a sudden became a lot more complex to try and manage that data quality. It certainly did drop off to some extent, because you've got another couple of pieces in that puzzle that became really complex.
Dr Andrew Novak:
So yeah, certainly not business as usual. I'd say in my role, working with rugby and UTS, it's technically a 50/50 split. But working in professional sport, that often gets swayed a little bit more towards the industry side of things. I did have a bit of opportunity there, because of that, to push back a little bit and spend a bit more time on the longer-term research projects. Yeah.
Dr Mark Watsford:
Yeah, sometimes a 50/50 split becomes a 70/70 split, hey? When you don't know where to put your time.
Dr Andrew Novak:
Exactly.
Dr Mark Watsford:
So, Sammy Palmer's asked a question. Do you have any experience with shifting perceptions of coaches that have previously been hesitant to incorporate data into decision-making? And Aaron has also asked a similar question about trying to actually instill the info to the end-users, when they might not want it or they might be intimidated by the use of data. Maybe Pat, have you had any experience? And what do you do to try and shift perceptions about the use of data?
Dr Patrick Ward:
Well, I mean I think most people... This stuff is pretty new to the NFL; it certainly isn't like baseball and basketball, where this stuff was new in the early and mid-2000s. So, many people are skeptical and unsure of it. And I think rightfully so; their livelihood depends on making good decisions, and they don't want to just forgo everything they've previously done.
Dr Patrick Ward:
So, the way that I handle that is the same way that I explained: keeping score of their own intuitive processes. I mean, we keep score of all of our own models, and I think that's a key thing, being really upfront about the uncertainty within the models. I think a lot of times, most people, and I think it was Tom that mentioned election forecasting earlier, think of outcomes as right or wrong, black or white, yes or no, as binary. But really it's a probability distribution. And so, just because there's an 80% chance that a player is going to be valuable, there's still a 20% chance that he might be a complete bust. And that could be for any number of reasons, and some of those things we can tease out in a model, or interrogate the model and figure out why. And some of those things we might never know, because it's either information we don't collect or we have a bit of epistemic uncertainty or stupidity.
Dr Patrick Ward:
But we keep track of all of that stuff, and we show our model errors. And we're as open about the times that we're right as we are about the times that we miss. And I think that is really important because it gives people this relaxed sense to speak to you more. It's not like a scientist in a white ivory tower, and most of those people have come under a lot of criticism lately with the entire COVID thing. So, we try and be very human about it, and have a lot of humility about the things that we see with our analysis, and how they can contribute to the conversation.
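One concrete way to "keep score" of probabilistic calls, in the spirit Pat describes (a generic sketch, not necessarily his team's actual method), is a Brier score over a prediction log. An 80% call that busts isn't "wrong"; it just costs a little accuracy, which the log makes visible.

```python
def brier_score(log):
    """Mean squared error of probability forecasts against outcomes
    (0 = perfect; a constant 50/50 guess scores 0.25)."""
    return sum((p, outcome)[0] ** 2 - 2 * p * outcome + outcome ** 2
               for p, outcome in log) / len(log)

# Hypothetical log: (predicted probability a player works out, outcome).
log = [(0.8, 1), (0.8, 1), (0.8, 0), (0.6, 1), (0.3, 0)]
```

Tracked over a season, the same log can be split by decision type (draft, free agency) to show exactly where the models earn their keep and where they miss.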
Dr Mark Watsford:
Yeah, I think that's important to note when you're right. But yeah, don't just omit those times that you might not be right. Tom, how have you managed to either crowbar in some info to some skeptics, or have you got any tips on how to actually convey the info to those that might not necessarily be ready for it?
Dr Tom Kempton:
Pat had a few really good points there. But I guess what I'd add to that is the relationship with the key stakeholder becomes really important. So, if you can work on building that relationship, so you're not just a guy who turns up with some numbers but someone that they know a little bit and who they know is invested in the program, I think that's helpful.
Dr Tom Kempton:
I think a real challenge for people in our position is like Pat said, we're really open about our failures, and we like to talk in probabilities and likelihoods and things like that. But often coaches just... and I've had previous head coaches say, "Just tell me yes or no. What should we do here?" And you're trying to explain, "Well, it's not quite that simple."
Dr Tom Kempton:
But I've seen a lot of people in high performance who are very successful who can sell a message, right or wrong, sell a message and deliver that plan to its completion. So, I think that's an area we need to get better at. And it's a constant tension: how do you stay true to your scientific roots and your understanding of probability, but still sell a compelling message and bring people along the journey with you? Because that's a lot of what's important in sport. So, how to do that without selling yourself out is a real challenge.
Dr Tom Kempton:
But by the same token, if you can't package a good product I think you're going to struggle to gain influence with the key decision-makers who, as Patrick rightly said, just want success; they don't care where it came from. So, they're not against data, it's just they might not have had exposure to it. And they've had to fight a long, hard journey to get to where they are, and they don't want a college kid with an MBA to come and ruin it all for them. So yeah, I can understand those tensions that exist in clubs.
Dr Mark Watsford:
Yeah. We've had another question from Shaun [Stolp 00:49:08]. It's a two-parter, but I might just ask the first one. How do outliers play out in your real-world work? So, in academia, models are built to try and satisfy assumptions and predict the majority of cases, but in the professional sport world it might be useful to predict rare cases of brilliance or the real high-achievers. Andrew, I'll start with you. Do you ever omit outliers, or how do you deal with them when you're building models?
Dr Andrew Novak:
Yeah, I mean that really varies from project to project, and depending on [inaudible 00:49:44] trying to get across to your stakeholders. In general, I mean the main thing you need to do with outliers is work out are these outliers realistic outliers, or are they due to some sort of measurement error? Anything that's due to a measurement error you're going to want to remove. If it's a legitimate outlier then that's probably something that you'll want to be reporting on. So, it really depends on the metric and the message you're trying to get across.
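Andrew's triage (measurement error versus legitimate outlier) can be sketched as a plausibility filter followed by a robust z-score. The speed values, hard bounds, and 3-SD cut-off below are all hypothetical choices for illustration.

```python
import statistics

def triage_outliers(values, hard_min, hard_max, z=3.0):
    """Split readings into likely measurement errors (outside physically
    possible bounds) and legitimate outliers (far from the median on a
    robust MAD scale). Bounds and threshold are illustrative choices."""
    errors = [v for v in values if not hard_min <= v <= hard_max]
    plausible = [v for v in values if hard_min <= v <= hard_max]
    med = statistics.median(plausible)
    mad = statistics.median([abs(v - med) for v in plausible])
    robust_sd = (1.4826 * mad) or 1.0     # guard against zero spread
    legit = [v for v in plausible if abs(v - med) / robust_sd > z]
    return errors, legit
```

The errors get removed before modeling; the legitimate outliers are usually exactly what you want to report on.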
Dr Mark Watsford:
Yeah. Pat, do you omit them or do you work with them? With outliers?
Dr Patrick Ward:
Well, I think... I mean, I guess I'll start by saying whether it's academia or the real world, industry or applied science, all models come with a box of assumptions that sit beside them that we have to be aware of. Performance at a higher level is right-tailed: there's going to be a whole bunch of people that aren't good, a bunch of people that are average, and then the few that become the hall of famers and the elite, et cetera. And so, being able to classify those players correctly is always a trick, because they are rare and few and far between; there's only so many.
Dr Patrick Ward:
So, what we try and do is use modeling strategies that can allow for some dependence on the population. So, obviously it's a homogenous population, if they've gotten to the NFL they've already been weeded out from the mortals like ourselves. So, we try and use models that have some sort of dependence on the average for the population, but allow for a fluctuation to occur when we see something unique.
Dr Patrick Ward:
A good example would be if I built a very simple model to project quarterback success in the NFL based on draft round. And if you did this, you could go on pro football reference and pull data and do this, you'll see that if you draft a quarterback in round seven, you have the highest chance of that quarterback being a hall of famer. Because Tom Brady was drafted in round seven, and he's arguably the greatest quarterback of all time.
Dr Patrick Ward:
Now, that's obviously a silly model, and if I took that to our GM he'd laugh at me, right? Anyone would laugh at you. So, we wouldn't build a model that simplistic, right? We would go to the next step to say, "Oh, on average, or for the population, we have an expectation of this." However, if we have seen enough performance from Tom Brady, the model begins to change and represents something that's slightly different. And oh, by the way, as it's representing something slightly different, we see that this guy is substantially better than the population. And that's how we try and account for those things: within the modeling process.
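The "dependence on the population average" Pat describes is the idea behind partial pooling. A toy empirical-Bayes version, with every number below invented for illustration:

```python
def shrunk_estimate(player_mean, n_games, pop_mean, pop_sd, noise_sd):
    """Pull a player's observed average toward the population mean; the
    pull weakens as evidence accumulates. A one-line version of partial
    pooling / empirical Bayes, with hypothetical inputs."""
    prior_precision = 1 / pop_sd ** 2          # confidence in the population
    data_precision = n_games / noise_sd ** 2   # confidence in the player's data
    w = data_precision / (data_precision + prior_precision)
    return w * player_mean + (1 - w) * pop_mean
```

With a handful of games, the estimate sits near the population average, so the model assumes a round-seven quarterback is ordinary; after hundreds of Brady-like games, the data dominates and the estimate moves to the player.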
Dr Mark Watsford:
Okay. It's complex, but that's why you can't just have those amateurs that are trying to dabble in your space without that real pedigree of experience and that inquiring mind to actually go about it properly. I want to just ask you guys, I'm just conscious of the time, we're heading towards the hour. Anyone else that has a question, feel free to post it in the Q&A now.
Dr Mark Watsford:
Tom, how do you see people getting into this area? Have you got any tips for people who are trying to become a performance analyst, a data scientist in the sporting context? Any tips for people who are listening in today who might want to work in this area? They've been inspired by you guys.
Dr Tom Kempton:
I think there's two parts to that question, and it's why this area's interesting: we've got two main groups of people that interact in this space. We've got some people who might be involved in the UTS master's course, for example, who want to be high-performance specialists or S&C coaches or GMs or whatever they want to be. And I think they need to understand enough about the data science principles so they know what to look for in the hire, how to interact with that person, how to direct them, and how to best leverage them. So, I think there's that group of people.
Dr Tom Kempton:
Then, aside from that, there are people who want to specialize in data science, for whom that's really their main focus. In sport, we're seeing two groups and two pathways. We're seeing people who are sport scientists first by training, and then they pick up a little bit of data science skills and work in that area. Or you get people from a data science background who like sport and have done computer science at university, and then they come into sport and work in that space.
Dr Tom Kempton:
So, I think there's two different paths, and they both have their advantages and disadvantages. I think if you've come from a sport science program you probably understand a little bit more about the sporting environment and about performance and things like that, and hopefully your data skills are pretty strong. But you'll typically find that someone from a computer science background will have much better programming skills, though there are exceptions like Pat, who is an excellent data scientist who has come through the sport science pathway. But my data science skills are definitely not as advanced as someone's from a more traditional background. So, I guess it depends what the organization needs and how they're looking to fill that role.
Dr Tom Kempton:
But I guess I touched on a few pathways that are formal. On top of that, whichever way you go to study, I think there's a lot of external programs you can do to upskill yourself. There's a whole heap of online courses in all areas of data science that are available for free, and there's some really great ones there, so I recommend engaging with those.
Dr Tom Kempton:
There's a really good online community as well. If you're learning technical tools like R or something like that, there's so much resource out there to help you upskill. Pat actually hosts an award-winning podcast called TidyX, I think it is, which is really worth checking out. It combines a lot of R coding with practical examples.
Dr Tom Kempton:
And I think finally as well, a lot of people get hired after starting a blog or something like that, particularly in America. They start a blog, they publish work, and people notice them that way. So, I think that's a good way to build a following. The community in Australia's small, but there's some pretty good guys online.
Dr Tom Kempton:
I just feel like the issue you run into is data availability. So, in America there's a lot of great data available for free. In Australia, that's not so much the case. So, it could potentially be worthwhile linking up with a professional team to actually gain access to data. They might not let you go and publish it, but at least you'll get a good data source to build your models on and develop your expertise, and in return the team will get some analysis. So, that'd probably be some advice I guess I'd give. And I'm sure Pat will give the details for his podcast as well [crosstalk 00:56:49] checking out.
Dr Mark Watsford:
[crosstalk 00:56:51] one for the UTS master's, one for Pat. And I'll just segue from that: when we do the performance analysis and data science subject in our master's, the four people you see are the preparers and the brains behind that subject, with me as the coordinator, but the expertise is going to be created and curated and developed by Pat, Andrew and Tom. So, it's an exciting thing to look forward to if you are enrolled in that master's subject.
Dr Mark Watsford:
I might just ask one more question, and I'll give the floor to Pat as well after that just for one more question as well. But Andrew, Aaron's asked a question about threats and opportunities in performance analysis. So, where do you think we should focus our attention, so that we can minimize the threats if there are any, or maximize the opportunities? Where do you see the next medium-term, three to five years, heading in performance analysis?
Dr Andrew Novak:
Good question. I want to say something around computer vision; I think that's a massive opportunity. But I don't know how close it is. It's been talked about a lot, and it's been used in some cases to track players, and that data is then used instead of GPS, for example, which allows you to monitor the opposition, which you can't normally do with a GPS system, those kinds of things. But as that moves into including pose estimation and things like that, I think that's a massive opportunity.
Dr Andrew Novak:
But I don't know, there's going to be so much data. We're already swimming in data, and that's just going to add more and more to it, and I think it's going to become really complex. And I think that can also be a threat, because I was talking to Blake about this earlier: for a lot of sport scientists working in the industry, a big part of their role is GPS analysis these days, and so there's potentially a threat there when you're automating that through something like computer vision.
Dr Andrew Novak:
But in terms of minimizing the threats, I guess it's adapting your skills, picking up more of these data science skills to be able to actually use that data. I think everyone needs to develop good research skills and methodologies, coming back to that research framework. So if somebody wants to use this data and start looking at certain metrics, you can actually apply a framework to go, "Okay, what's the theory behind this? What's our hypothesis?" and actually work through it and problem-solve it and decide whether it's worth using or not. So I'd say, yeah, probably that is the best way to mitigate.
Dr Tom Kempton:
Just on the video tracking, I agree, that's a really exciting area, the possibilities there are huge, and there's some good groups in America. But I saw recently in soccer, a game in I think it was Scotland, where the camera is automatically trained now to follow the ball. And there was a linesman running up and down the touchline who has the same hairstyle as me, and they kept mistaking his head for the ball, so the camera kept panning down and focusing on this linesman's head. So, I think there still is a way to go [inaudible 00:59:59] the human element.
Dr Mark Watsford:
There's a sponsorship opportunity there [inaudible 01:00:04]. Pat, you can either answer, give us a tip for getting into the industry, or where you think we're heading in the next three to five years in this space, your choice.
Dr Patrick Ward:
Yeah, all right.
Dr Mark Watsford:
[crosstalk 01:00:14]
Dr Patrick Ward:
Yeah, I think Tom's last point is probably the one that I would echo, just because it's the way that I got in as well. Doing work in the public space is so important, and it literally costs you nothing to have a Twitter account or a blog or a screencast, like I have. It literally costs you nothing to do that stuff, to put out information. Even if it's on Twitter, literally doing your own analysis of anything, it could be stuff that people have done a million times over, but just doing it yourself and putting it out to the world is a great way to have people interact with you, talk with you, offer ideas and thoughts, et cetera.
Dr Patrick Ward:
So yeah, our screencast... I mean, like I said, the blog I had kind of helped me get into the field in terms of getting noticed by people, I guess you could say. The screencast was sort of our way to give back. So, there's a project called the TidyTuesday project, which runs every Tuesday: a new dataset is put out by TidyTuesday, as part of the R for Data Science community, where anybody can download it, put a graph up on Twitter, and then people comment and talk.
Dr Patrick Ward:
And so our thought, myself and my cohost, was that there's a decent number of people doing this; let's put up a poll on Twitter and see what is holding people back from getting involved. And most people were like, "Oh gosh, I'm so nervous, because I'm just trying to learn how to code, and if I put up something that's just a simple dot plot or bar plot or something, are people going to think that's crap?" et cetera. So, we decided to start a screencast where every week we would feature someone else's R code, and we would go line by line and explain it for anybody that was learning. And then we'd follow that up with one of our own examples that was usually sport-based, and we'd try and teach something, whether it's building Shiny apps or web scraping. We try and teach something within that as a way to give back.
Dr Patrick Ward:
And so I think doing work in the public space is so important. If you don't have access to data, if you're in Australia, use American websites and do American analysis. The other thing I would say is, maybe if you're not used to exercising, it's a good time to start. Get a Garmin watch, and you can download the data right from Garmin Connect and automatically start to have your own GPS reports. And that's a great way for students to learn how to do GPS analysis. So, when they get out and they try and apply for an internship with Andrew, the first time they see GPS data isn't day one of their internship, right? They have a background in that stuff. So, that's what I would say: public work is so valuable, it's a great way to interact with people and learn.
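Pat's Garmin exercise can start as small as binning exported speed samples into velocity bands. The sample layout below is a hypothetical simplification, not Garmin Connect's actual export schema; adapt it to whatever the real file contains.

```python
def distance_in_bands(samples, bands):
    """samples: (seconds_elapsed, speed_m_per_s) pairs from a watch export.
    bands: {name: (low, high)} speed thresholds in m/s.
    Returns metres covered per band. The field layout is a hypothetical
    simplification of a real GPS export."""
    totals = {name: 0.0 for name in bands}
    for dt, speed in samples:
        for name, (lo, hi) in bands.items():
            if lo <= speed < hi:
                totals[name] += speed * dt    # distance = speed * time
                break
    return totals
```

This is the core of a basic GPS report: total distance per band, which a student can then extend with accelerations, session summaries, and week-to-week comparisons.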
Dr Mark Watsford:
Yeah, that's good advice. My son just started a pamphlet route, and so he's got all these chances for metrics: elevation, number of pamphlets delivered and things. So, I might [crosstalk 01:03:18]-
Dr Patrick Ward:
So, is that going to be information that your son is going to analyze, or are you making him a science project? Like you're going to attach all these things to him and see...
Dr Mark Watsford:
You can decide which way it's going, Pat.
Dr Patrick Ward:
You realize if you move faster you can get 10 more homes on the route, and that leads to X amount of money. So, now you're monetizing it, that's really important.
Dr Mark Watsford:
Paying me more rent, that's right.
Dr Patrick Ward:
Yeah.
Dr Mark Watsford:
So, I'm just conscious of the time, guys. We have to wrap it up. My takeaways: we really need to ask relevant questions, which came up early, and I think all three of our panelists were pushing an approach where, contextually, we need to be very mindful of asking the right questions. Video is still a very important tool, and framing the useful information in a format that the coaches want is what actually conveys our message really well. We need to know the limits of the models we're using. And we really need to just make sure we're practicing the art, by getting any data that we can. And there's lots of publicly available data that's just been flagged in the last five minutes.
Dr Mark Watsford:
Hopefully, all of you out there in audience land enjoyed the chat, and got a few take-home points of your own from that. We really appreciate the time that you've taken, Pat, Andrew and Tom, to share some insights in what you do for work. We really wish you all the best in either the upcoming season or the current season. And yeah, as we said before, these guys are going to be involved in the UTS course over the next six months or so. So, I'm going to bid everyone farewell. I'd really like to thank the panelists. Everyone can give a virtual round of applause. And hopefully everyone has a great rest of day or rest of the evening if you're in the US. But thanks everyone for tuning in.
About the webinar
Data in sport is a valuable tool; however, interpreting data to get the most valuable output for athletic performance can be challenging. In this not-to-be-missed webinar, we explore contemporary concepts in performance analysis, including the integration of large data sets from multiple sources, to improve decision-making and positively impact performance outcomes.
This FREE webinar was facilitated by Associate Professor Mark Watsford, alongside a panel of HPRC researchers, alumni and associates with industry knowledge and experience working across a range of high-performance sporting codes in Australia and the USA.
The panel
Dr Patrick Ward
Patrick Ward currently works in the Research and Development Department of the Seattle Seahawks, where his role centres around data analysis on a variety of topics within the sport of American football. Prior to joining the Seahawks, Patrick worked as a sport scientist within the Nike Sports Research Lab. Patrick has a PhD in Sports Science and is a Certified Strength and Conditioning Specialist (CSCS) with the National Strength and Conditioning Association (NSCA). Patrick's research interests centre on the evaluation of training demands as they apply to athlete well-being, injury, and performance across a variety of sports.
Dr Andrew Novak
Andrew Novak is conducting research and developing data driven solutions in human performance (primarily collective team behaviours, skill acquisition, athlete monitoring and esports performance). As an industry-embedded researcher, Dr Novak collaborates with rugby coaching staff across Australia’s elite rugby programs as well as UTS academics. His work focuses on developing methods to analyse and interpret the complex interactions between physical, technical and tactical characteristics that underpin elite team performance.
Dr Tom Kempton
Tom Kempton has been the Sport Science Manager at Carlton Football Club since 2015 and previously worked within the NRL. Tom has an extensive research profile examining quantifiable factors that affect performance in team sports.
This webinar was facilitated by Associate Professor Mark Watsford.