Designing learning dashboards

Posted on 28 November 2017

This episode is for learning professionals who are ready to dive deeper into their learning data. Robin speaks with Stewart Rogers from Lambda Solutions about framing your learning analytics with visual dashboards.

Transcript

Robin:
Hi, it's Robin here, the host of the Learning While Working podcast and the founder of Sprout Labs. In this podcast I'm talking with Stewart Rogers about designing learning dashboards. Stewart works at Lambda Solutions on a learning product called Zoola.

Zoola works with Totara Learn, Moodle and xAPI data, and opens up exciting possibilities: you can use a basic learning record store to record the data, and then use Zoola to add really powerful learning analytics and visualisations on top. This podcast is a really deep dive into learning analytics. Sometimes it's a bit technical - hopefully not too technical.

Dashboards have become commonplace in enterprises as a way to rapidly visualise and understand business performance. David Hegarty, one of the other principals at LearnD, talks about how he attended a board meeting for a publicly listed company he was working with. He realised that the conversations were focused on the dashboards and the scorecards, but at the same time there was no visibility of learning performance in any of those dashboards or scorecards. He walked out of the meeting and then worked, over a period of time, on developing those scorecards to give visibility of learning at a senior level.

In the podcast, Stewart gives a really nice framework for focusing each dashboard on a single KPI. He also has lots of other great advice about getting started in learning analytics.

Stewart, welcome to the Learning While Working podcast.

Stewart:
Thank you. Happy to be here.

Robin:
You work with a learning analytics product called Zoola. What is it, and how does it work?

Stewart:
Yes, Zoola Analytics is a reporting and analytics solution tailored specifically for learning analytics, if you will. We started this about three years ago, and probably went to market maybe two and a half years ago. Our ultimate objective is to open it up, give you access to all your data, and let you build the reports, dashboards, charts - whatever you need - on top of that data in a way that makes you feel good about it. Which I guess is kind of a fun way to say it, but we just got kind of tired of limited access to data, of not being able to access data. Time spent is kind of a classic example - you can't accurately report on how much time people are spending learning online.

And so, helping people with that. But then also, on the flip side of that, tables and charts - making things that look nice so that you can just spit it out, put it into a presentation, put it on a dashboard, post it somewhere, without a whole lot of reformatting in Excel.

Robin:
Yes, learning analytics is a hot topic at the moment, and the first step is actually getting and collecting the data, and then working to get it into the right formats. Lori Hofmann, in another podcast, talked about how she was actually using that engagement time to measure and think through engagement - what people are engaging in - and to affect learning design outcomes. It was interesting, because she was incredibly low tech about the way she was doing things, so she was avoiding some of the issues that you seem to be solving.

Stewart:
Yes. We try to do the same thing for our customers as well. They start with very basic questions around time. How long do they spend in a course? Online, in a course. And then it grows from there. So within a course, how long do they spend looking at the PDF, watching a video, answering the quiz - getting into the specific time. Is that time growing or shrinking over time? So we get a little bit deeper into it. But yeah, the first one is usually just a simple query: show me the course, show me the student, show me how much time.
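That starting query - show me the course, the student, and the time - can be sketched in a few lines of Python. This is only an illustration, not how Zoola works internally: the session records, names and numbers below are all invented, and in practice this data would come from the LMS activity log.

```python
from collections import defaultdict

# Hypothetical session records as an LMS log might supply them:
# (student, course, minutes in one session). All values invented.
sessions = [
    ("alice", "Safety 101", 12),
    ("alice", "Safety 101", 8),
    ("bob",   "Safety 101", 25),
    ("alice", "Sales Onboarding", 40),
]

# "Show me the course, show me the student, show me how much time":
# total session minutes per (course, student) pair.
time_spent = defaultdict(int)
for student, course, minutes in sessions:
    time_spent[(course, student)] += minutes

for (course, student), minutes in sorted(time_spent.items()):
    print(f"{course} | {student}: {minutes} min")
```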

Robin:
So it leads into my next big question. When you come to designing a learning dashboard, what are the steps to think through?

Stewart:
There's always a couple of things. Maybe the easiest one, if you've got the right mindset for it, is the visual aspect, and I'll come back to the content of it in a bit. But we're finding, even just in general, people kinda get hung up on it: I've dropped three or four things into a dashboard - a couple of different data points, different charts, different tables. Add in some opportunities to interact with the data, whether it's just high-level filtering and exploration type stuff, or even drill-down, drill-through type stuff, building that into the dashboard.

And we find that technically putting the pieces in place is easy enough - there's my filter, there's my chart, there's my drill-through, and that all works. But getting it to fit back into some other app, whatever that is, either a stand-alone dashboard or embedded into something else - getting that to fit in seamlessly, with a nice experience, giving people easy ways to interact with that data in a way that's usable, even though all the capabilities are there, is one of the bigger challenges. So a lot of the effort is sometimes just trying to come up with nice colours and a nice theme for it.

So then coming back to the data. We do find that there's quite a bit of thinking that has to go into that as well - getting dashboards that are focused on specific things and not a mish-mash of three or four different data points. So you have a dashboard that is maybe interactive, but also oriented around one thing - different people will call it different things, but say the reaction to, or satisfaction with, different courses. So that you can visualise summaries - a heat map of some sort, say - to show: wow, my really popular, highly rated, highly satisfying courses are green, the ones that are a little less so are red, and then being able to drill down from there.

So, picking the right visualisation, but keeping the data all focused on one topic, and then maybe evolving separate dashboards for quiz results or grades analysis. And having different data points that all hook together nicely, so that it's one holistic experience from the dashboard perspective. These are all the things that people get tripped up on. Quite often we see a first dashboard where they've dropped in that one and that one and that one, and none of them are connected; they interact and drill out into other places, and people get lost.

So, there's a couple of things right there. It's just getting good, focused dashboards - here's a KPI that we're measuring, here's a dashboard for that - and then building out a nice experience for people to interact with it. And then repeating that for the next KPI and metric. You don't need all of your KPIs in one dashboard, on one page per se. So that's usually where we kinda help some of our people: just efficient design of those dashboards.

Robin:
Yes, so essentially the principle you're talking about is for each dashboard to really just look at one particular thing and focus on that, and not try to build a single dashboard that brings in everything. Or if you do have everything, you have it in layers, Stewart? Is that what you're saying with the whole drill-down notion - that you might have, say, four KPIs on the first one, but you drill into each one of those in depth?

Stewart:
Potentially, yes. If they get to a point where they've got four or five KPIs that they're monitoring, then you could summarise all that back into one top-level dashboard. But you want to build in the drill-down capabilities so that you can get from: okay, that chart, I need to know more about that. Why is that red? I'm gonna drill in and get the detail. And maybe it's two or three different visuals, and then you're going to drill into the next red block and get behind that.

So, layers is a good way to describe it. The classical BI terminology for it is drill-down or drill-through, but yeah, you could look at it as a series of layers. And it's not uncommon. Usually the first question is, "That's a nice chart - give me the data." I've played that card myself internally. But that could just be the next layer below it.

Robin:
I think L&D separates into two fairly large camps: people who do care about the data, and people for whom it's really quite foreign. So even as we were talking I was sitting there thinking: thank you for talking about drill-downs, because I was doing a whole, "Oh, maybe we need to explain that term." Because that might be really foreign to some people.

Stewart:
Yes, good point. It can be. So drill-down - for me, at least; maybe somebody has a different definition - is: you start with a summary and you drill down into the detail. But it all stays within the contained information. So I go from a chart, to a detailed chart, to a table, to a more detailed table. You don't have to go that far, but it could go that far. Drill-through, for me, takes you out into something else. So I've gone down, and then through into something else.

So we do a lot of learning analytics for Moodle and Totara Learn, and we build in the capabilities for them: I've got a chart that summarises the information. I'm going to drill down and see a detailed record of each thing that happened, maybe within a course. And then I'm going to click something in that chart and drill through back into the LMS. So now I'm looking, in the LMS, at the actual issue, so I can resolve it right there. We drill down into the detail and then through, back into the LMS, so you're looking at the source data. That's how we build it out for our customers.
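As a sketch of those layers - summary, drill-down, drill-through - the flow might look like this in code. The event records, field names and LMS URL pattern are all hypothetical, invented purely for illustration.

```python
from collections import Counter

# Layer 1 -- the summary a dashboard chart would show: events per course.
# All records and field names below are invented for illustration.
events = [
    {"course": "safety-101", "student": "alice", "verb": "completed"},
    {"course": "safety-101", "student": "bob",   "verb": "failed"},
    {"course": "sales-onboarding", "student": "alice", "verb": "completed"},
]
summary = Counter(e["course"] for e in events)

# Layer 2 -- drill-down: "why is that red?" Pull the detailed records
# behind the chart segment the user clicked, staying inside the data.
detail = [e for e in events if e["course"] == "safety-101"]

# Layer 3 -- drill-through: leave the dashboard and jump to the source
# record in the LMS, e.g. via a deep link (URL pattern is made up).
failed = next(e for e in detail if e["verb"] == "failed")
link = f"https://lms.example.com/course/{failed['course']}/user/{failed['student']}"
```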

Robin:
I suppose that's possibly one of the real powers of a dedicated learning analytics tool rather than a generic BI tool as well? Because you have the ability to have that tight relationship between the learning system and the analytics.

Stewart:
Yes, the general purpose BI solutions feel very general purpose. And they're built, typically, for extremely technical audiences, right? It's the data analytics team that's gonna crunch all your BI data. Those people are highly specialised, highly skilled in data analysis, and the capabilities are built in for those kinds of people. So we've taken those concepts and tried to make them as learning-analytics friendly as we can.

Robin:
Yes. There are also two really interesting terms we're using here that are worth thinking through. There's the visualisation side of it - in some ways, everyone talks about analytics as being visualisation - but there's quite often another layer of analysis work that happens before the visualisations. That might be statistical analysis. What are your thoughts about that layer of analytics in terms of learning?

Stewart:
Yes. I always use "visualisations" a little bit generically, even myself. I think some people naturally leap to: visualisation equals charts or dashboards. But for me it could even just be a table, a raw data extract - if you want to visualise it that way, I'm good with that. So there's definitely that layer, and again I think we could open this up and find different definitions for it. But for me, visualisations are really the output of analysis, and so for me those are reports. I can also generically use "reports" to represent those as well.

And that layer in the middle - the analysis layer - is sometimes the complexity that's built into your reporting, your learning analytics. So I have a bunch of raw data, and I'm going to do some analysis. Sometimes it's just: get data, put data here. In other cases you've got to do a little bit of computational, statistical work, even if it's just simple averages or medians, just to give you some middle points of your data.

But it doesn't have to be that complicated for people to draw those kinds of conclusions. That's the analysis that goes into your data, and that's the hidden layer between the output - your visualisation, your report - and your raw data. You don't always see that happening.
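As a minimal example of that hidden analysis layer, even the simple averages and medians Stewart mentions can be computed with Python's standard library. The quiz scores here are invented:

```python
import statistics

# Invented quiz scores standing in for raw data pulled out of an LMS.
quiz_scores = [55, 72, 78, 80, 81, 85, 98]

# The analysis layer: simple "middle points" of the data that sit
# between the raw records and the chart or report you publish.
mean_score = statistics.mean(quiz_scores)      # pulled around by outliers
median_score = statistics.median(quiz_scores)  # a more robust middle point
```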

Robin:
Yes, and I think what's fascinating is that some people might be scared or nervous about some of the things you're talking about. But at the same time it's just high school maths.

Stewart:
Well for some people high school math is scary.

Robin:
Yes, I suppose high school English was scary for me, so, I can - yeah, alright!

Stewart:
No, it is. We cite this fairly often, but there was a study that came out from, I think, Chief Learning Officer a couple of years back: 50% of L&D departments do not have any data analysis skills at all. Anything. And we've talked to people that have very basic Excel skills: I can open it up, I can format it, I can save it, I can maybe add cell A plus cell B to get column C. I can do that kind of stuff. But it stops when you start talking about pivot tables or charts or VLOOKUPs.

These are not super advanced things, but they're advanced enough that you've got to be generally comfortable with data to make them work. And that's the skill level a lot of people are worried about: "I don't know how to do a pivot table." We're like, "Oh, okay, I can show you that, that's not that hard." And then they kind of get the feeling: "Oh, I drop this thing in and it gives me an average of all that - then it makes sense. Okay, I get that." They may not be able to reproduce it, but they can follow along with what we've done, and I'm like, "That's the same as what we do over here in our analytics product." They're like, "Oh okay, I get that." Up to that point, that's all they knew.
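For readers to whom the term is foreign: a pivot table just groups rows by one field and aggregates another. A rough Python equivalent, with invented records, might be:

```python
from collections import defaultdict
from statistics import mean

# Invented completion records, the kind of rows an Excel sheet might hold.
records = [
    {"department": "Sales",   "course": "Safety 101", "score": 80},
    {"department": "Sales",   "course": "Safety 101", "score": 90},
    {"department": "Support", "course": "Safety 101", "score": 70},
]

# A pivot table's core move: group by one field (department),
# then aggregate another (the average score).
groups = defaultdict(list)
for r in records:
    groups[r["department"]].append(r["score"])

pivot = {dept: mean(scores) for dept, scores in groups.items()}
```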

Robin:
It's partly that - demystifying some of it as well.

Stewart:
Yes.

Robin:
Have you seen anyone doing any interesting correlation between learning data and business performance data?

Stewart:
I mean, there's definitely some interesting things out there. To do this effectively you've got to get beyond the LMS, because it just doesn't have context for business results. And I think this is really where the xAPI - the Experience API, the Tin Can specification - comes into play, if you will. Now you can start to connect two different data points, potentially very different data points. So I've got a bunch of sales people going through a bunch of courses, and I've captured all of that as learning experiences. Then I pull in all of their closed opportunities, and now hopefully I can prove that: here are the results before training, and here are the results after training. Maybe they just got naturally better over time, or maybe something else has changed to make it better, but hopefully, from a learning perspective, you can make the correlation. Sales and training are always the easiest examples to draw.

But even if you look at some of the other LRS vendors, they've got interesting stories to tell.
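The before-and-after comparison Stewart describes can be sketched very simply. The figures below are invented, and as he cautions, a real analysis would need to rule out reps simply improving over time or other changes in the business:

```python
from statistics import mean

# Invented closed-opportunity counts per sales rep, for the quarter
# before and the quarter after training.
before = [4, 5, 3, 6]
after = [6, 7, 5, 8]

# The naive "lift": difference in average deals closed per rep.
lift = mean(after) - mean(before)
```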

Robin:
Yes. That example's also a nice one because it has a clean correlation, and it's just about bringing one data point into your learning systems to be able to look at it. But sometimes you have to have a very complicated set-up as well.

Stewart:
Yes. You find in education they're always trying to make a connection between anything and successful students. So they're always looking to connect in all these other systems and see if they can start to draw lines and say, "Okay, successful students in these courses tend to do these things." They spend a lot of time studying this information, and it feeds a lot of very different data into your learning record store. And so then the learning analytics become very complicated - you may want to consider bringing in a professional in that case to help you analyse it. But in that world, there's a lot of interesting things happening.

Robin:
Yes, it might be something I'll follow up in a future podcast, because it seems like there is actually a whole lot more work around learning analytics happening in the education area than there is in the workplace learning area. I think it's partly the difficulties around workplace learning and learning goals.

Stewart:
Yes. I think the education industry was first to jump on xAPI and the LRS. But the anecdotal data I'm hearing is that adoption of xAPI within corporate industry is now growing faster than in education.

Robin:
Yes. So Stewart, if someone wanted to get started in really improving their learning analytics - I was thinking about learning dashboards - what do you think should be the first step?

Stewart:
Beyond taking an intermediate-level Excel course? I mean, I kind of joke about that, but it really is the starting point for most people in their analytics journey. So let somebody else do the hard work: find a chunk of data that you can pull into Excel and figure out how to do some charts and pivot tables in there. And if you aren't comfortable with that, there are tons of resources available for learning Excel - free resources, but also online courses and things like that. It's not too difficult to find Excel learning resources. Even a lot of our own people - most of our L&D team internally - have gone through continuing education courses on analytics. Most of them started with basic SQL courses, data structures, that type of thing. We've put non-technical people through those courses, and they come out with a much more rounded education around how to extract data. And then the second part of those courses takes you through basic visualisation concepts: how do I get this raw data into something visual that's meaningful to somebody, beyond pie charts?

So you can probably very robustly round out your skills by going through an Excel course and then some basic data structures courses, and almost every higher ed facility - not university-level type stuff, community college type stuff - will have data-oriented continuing education courses.

Robin:
There's a few courses around preparing people to be "data scientists", to use the jargon. You're not talking about that level - more a different, introductory level of data analytics?

Stewart:
Yes, even in that space there are layers. There's the DBA, the database administrator, who's responsible for extracting data - they know all the tricks and can quickly extract data and build out data structures and things like that. The next layer on top of that is usually the data analyst, who's going to be fairly robust and skilled with different BI platforms. And then on top of that is the data scientist, and that's the person that's digging deep and really looking for meaning in that data - looking for trends and patterns and things like that. That's a whole other skill level. I'm not sure your L&D department needs somebody to evolve to a data scientist level, but having basic data knowledge skills - that seems like a required skill on the team somewhere.

Robin:
Cool. That was a really nice way of wrapping up this session of the podcast, Stewart, with some really nice gems of advice. So thank you for taking a deep dive with me around dashboards and learning analytics today. It's been great to have you on the podcast.

Stewart:
Thank you, appreciate it, this was a good time. Nice to talk to you.

