Why Horizon Scanning matters

In February, Paula Shaw, Academic Manager at the University of Derby Online Learning, and Julie Stone, Associate PVC External Affairs and Director of Online Learning, were invited to present a workshop at the EDUCAUSE Learning Initiative (ELI) conference in Anaheim, California. This three-day conference showcased models and prototypes, providing participants with ideas for innovative learning design and future-focused technologies in practice, culminating in the unveiling of the 2019 Horizon Report.

What is the EDUCAUSE Horizon Report?

This is an annual report constructed by an expert panel of more than 100 thought leaders in educational technology. Although 75% of the panel is made up of experts from the USA, the remaining 25% includes professors from Europe and Asia, to ensure that the resulting report has a global perspective. The report focuses on:

  • Key trends accelerating technology adoption in Higher Education
  • Significant challenges impeding technology adoption in Higher Education
  • Important developments in educational technology for Higher Education, which includes an estimated time to adoption

In the past, the third element, forecasting, has attracted some criticism. Each year the report starts by charting which of its predictions actually made it to a scalable solution and, as can be seen, over the past eight years very few of these innovations have made it to ‘business as usual’.

So what are the barriers?

In the 2019 Horizon Report, the authors have introduced a new section called ‘Fail or Scale?’, in which the expert panel provides a deeper analysis. For example, some trends are still developing but their timescales were perhaps too ambitious, while others have stalled because of cost or because the anticipated IT infrastructure was not sufficiently developed. This new section is welcome, as forecasting the future, as we all know, is not an exact science!

What was our contribution to the conference?

Even though terms such as online, distance and blended learning have been in use for some time, they remain ill-defined at all levels of educational discussion. This lack of clarity makes it hard for members of an institution to have meaningful conversations about infrastructure and pedagogical approaches. Let’s take the example of just one online module. At one end of the continuum there is a Virtual Learning Environment (VLE) platform through which students can see a space that houses PDFs and PowerPoints, perhaps a few lecture capture videos, and a tool through which to submit their assignment. Is this online learning? No, because the learning doesn’t take place in that space; it takes place in the physical classroom, and these are supporting resources. Is it blended learning? Maybe, if students are directed to use the resources before attending the classroom.

So, at the other end of the continuum, what is online learning? The same platform spaces can be used to house ‘content’: generally lessons remastered in a digital format, using the expertise of learning designers to ensure that there are no gaps in the information (because gaps can’t be corrected in a physical classroom). In addition, a range of networking tools is used to enable academics and students to work together remotely.

So, we have some parity between the learning experiences, but what about the university experience? Campus-based students can use physical facilities such as the library, and can meet with careers advisors, student wellbeing services and study skills advisors. Are these denied to online students? In many universities, yes, because they haven’t worked out how to provide an equivalent university experience at a distance.

Our coaching workshop used a framework to help educators think beyond the classroom and consider two things that work like yin and yang to enhance the online experience:

  1. The institution’s educational planning e.g. engaging stakeholders, applying organisational strategies, questioning the quality of the digital infrastructure and
  2. The learning experience, considering pedagogic approaches that could seriously improve digital learning.

The third part of this framework places horizon scanning in the middle to imagine what this environment could look like in the short term (1 year or less), mid-term (2 – 3 years) and long-term (4 – 5 years).

At the end of the workshop, participants were asked to collaboratively create a handful of challenging questions to take back to their institutions, to advance and improve online learning.

In our discussions, delegates at the conference were impressed by the non-deficit model of online learning that we have designed at the University of Derby, through which we express our commitment to online students gaining access to the range of services that any full-time student would expect.

What did we learn from others?

We saw research from across the landscape of higher education providers in the USA, focused on the impact of learning design within the curriculum, particularly where it was delivered fully online. The research demonstrated that where learning designers were utilised alongside the academic subject matter expert, students engaged in a range of different learning practices rather than just reading content, and student success improved as a result. This provided further validation of our approach of involving academics, learning designers, and media and content developers in designing and creating the digital learning experience for online learners. We also believe this approach is part of the reason why we enjoy a 90%+ retention rate.

It is clear that learning analytics is viewed as a valuable source of data, but it is still uncertain how to cut that data to make it meaningful for improving student success. On its own, without further analysis and remediating intervention, it is just data. Educators we spoke to at the conference were trying out different ways to represent the data simply, cutting across different learning tools to build individual student dashboards. They felt that until the data could be represented simply, they wouldn’t be able to make meaningful interventions. In some institutions, specialist data analysts were being employed to highlight trends and work with educators to present the data for them and their students. Taking learning analytics beyond this level, there was a great deal of excitement about adaptive learning. This includes sharing the results of micro tests with students and offering additional learning packages to fill their specific knowledge gaps. However, educators were concerned about the cost of creating additional remedial content; some were looking to Open Educational Resources (OERs) to plug the gap.
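The adaptive-learning idea described above can be sketched very simply: mark a micro test, identify the topics a student struggled with, and point them to remedial resources such as OERs. The sketch below is illustrative only; the test items, topic names and OER links are all invented, not taken from any real system.

```python
# Minimal sketch of micro-test-driven remediation. All questions,
# topics and resource links below are hypothetical examples.

MICRO_TEST = {
    "q1": {"topic": "referencing", "answer": "b"},
    "q2": {"topic": "referencing", "answer": "a"},
    "q3": {"topic": "paraphrasing", "answer": "c"},
}

# Mapping from topic to remedial resources (imaginary OER links)
REMEDIAL_OERS = {
    "referencing": ["https://example.org/oer/referencing-basics"],
    "paraphrasing": ["https://example.org/oer/paraphrasing-guide"],
}

def recommend_remediation(responses):
    """Return remedial OERs for every topic with at least one wrong answer."""
    weak_topics = {
        q["topic"]
        for qid, q in MICRO_TEST.items()
        if responses.get(qid) != q["answer"]
    }
    return {topic: REMEDIAL_OERS[topic] for topic in weak_topics}

# A student who missed both referencing questions is offered
# the referencing resource only:
print(recommend_remediation({"q1": "a", "q2": "b", "q3": "c"}))
```

Real adaptive platforms do far more (item banks, difficulty models, spaced repetition), but even this toy version shows why the cost concern is real: every topic needs remedial content authored or sourced before the recommendation step is useful.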

The third related embryonic idea presented there, bringing data and remedial content together, was the use of personal assistants (e.g. chatbots such as Alexa) within the VLE. Running alongside the digital content, the chatbot asks: ‘Do you want more information to help you understand?’ ‘Do you want to read a more complex, research-informed answer?’ Yet how to draw in the right level and type of remedial content remains a wicked challenge until the underlying algorithms become more effective.
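As a thought experiment, the chatbot’s role can be reduced to a rule that offers the student a choice of content depth after each piece of core material. Everything in this sketch is hypothetical; the genuinely hard part, selecting the *right* remedial material for a particular student, is exactly the wicked challenge noted above and is not solved here.

```python
# Toy sketch of a VLE chatbot offering tiered follow-up content.
# The prompts and responses are invented placeholders.

CONTENT_LEVELS = {
    "more": "Here is a simpler explanation with a worked example.",
    "research": "Here is a research-informed answer with further reading.",
}

def vle_chatbot(student_reply):
    """Respond to a student's choice after presenting the core content."""
    prompt = ("Do you want more information to help you understand (more), "
              "or a more complex, research-informed answer (research)?")
    reply = CONTENT_LEVELS.get(student_reply.strip().lower())
    # Re-issue the prompt if the reply doesn't match a known option
    return reply if reply else prompt

print(vle_chatbot("more"))      # simpler explanation
print(vle_chatbot("research"))  # research-informed answer
```

In practice the lookup table would be replaced by something that selects content by topic, level and the student’s analytics profile, which is where the cost and algorithm concerns from the previous paragraphs converge.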

This level of personalisation, as you can see, is intertwined with data, algorithms and remedial/advanced content, and it is this unpredictability of cost, time and infrastructure that makes the future so difficult to guesstimate. What we do know, however, is that these ideas are bold, brilliant and future-focused, and it is embedded within our values to explore and try to solve these wicked challenges. Collaborating with a global network of like-minded educators can not only help us to understand the challenges better but also help to solve them more quickly.
