Digital Assistants and Informed Decision Making


Solving the myriad day-to-day problems encountered on our campuses is getting harder, and the lack of time to solve them exacerbates the difficulty, so the level of urgency rises. This coupling of rising complexity and urgency all too often means that students, teachers and support teams find it harder to make informed decisions to support their studies or work. One factor that contributes to complexity is the growing volume of data that permeates our education institutions. This is especially true if we see every facet of a campus as an entity that produces, manipulates or consumes data. The traditional tools used to manage data, such as management information systems, CRM systems, business intelligence systems, learning management systems and data dashboards, all too often fall short. In many cases they exacerbate the problems associated with complexity and urgency. [1] Norbert Wiener noted this problem during the 1950s, observing that whilst communication mechanisms become more efficient, they are subject to increasing levels of entropy. [2]

Wiener also stated that external agents could be introduced to control entropy. Campus digital assistants can be classed as one such external agent: an agent that can be used to mitigate complexity and urgency. Mark Weiser and John Seely Brown later coined the term calm technology to describe the nature of such services. [3]

How will campus digital assistants inform the decisions made by students, teachers and support teams?

How can campus digital assistants reduce entropy and inform decision making?
Campus digital assistants mature over three distinct stages of development.

During the first stage, students, teachers and support teams are presented with contextualised responses as they pose a myriad of questions to their digital assistant. The ability to garner on-demand information that is contextualised to the individual is a key trait of these services.

During the second stage, the campus digital assistant acts with a degree of agency. The assistant behaves in a deterministic manner: if condition A is observed, then carry out action B to support the student, teacher or campus support worker. Examples include producing weekly student report cards; nudging students and teachers with timely and contextualised information, advice and guidance; or praising students for a recent assignment grade.

During the third stage, the campus digital assistant starts to make use of probabilistic modelling to inform its actions, especially when the number of possible actions is very large. Probabilistic modelling enables campus digital assistants to support a broad range of tasks, including offering real-time feedback to students as they undertake formative assessment tasks; offering students information, advice and guidance as they progress with their studies; or adapting teaching, learning and assessment materials within an online tutorial. [4]

In each stage of development the campus digital assistant enables students, teachers and support teams to garner information and insights that allow them to make informed decisions to further their studies or work across the campus.
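The difference between the second and third stages can be sketched in a few lines of Python. This is a minimal illustration rather than a real campus system: the login threshold, the action names and the probability weights are all hypothetical assumptions introduced here for the sake of the example.

```python
import random
from typing import Optional

# Stage two: deterministic agency - "if condition A is observed, carry out action B".
# The seven-day threshold is an illustrative assumption, not a recommended value.
def deterministic_nudge(days_since_login: int) -> Optional[str]:
    # Condition A: the student has not visited the course for a week.
    if days_since_login >= 7:
        # Action B: send a timely, contextualised nudge.
        return "We noticed you haven't visited the course recently - need any help?"
    return None  # No condition met, so the assistant stays quiet.

# Stage three: with many possible actions, a probabilistic model weighs them
# instead of a fixed rule. Here a single estimated probability of success
# shifts the weight between praising the student and offering feedback.
def probabilistic_action(estimated_success: float) -> str:
    weighted_actions = {
        "praise_recent_work": estimated_success,          # likelier when doing well
        "offer_formative_feedback": 1 - estimated_success # likelier when struggling
    }
    actions, weights = zip(*weighted_actions.items())
    return random.choices(actions, weights=weights, k=1)[0]
```

The deterministic rule always produces the same action for the same input, which makes it easy to explain; the probabilistic version trades some of that transparency for the ability to choose sensibly among many candidate actions.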

The nature of the education setting means it is of the utmost importance that the actions and decision-making capabilities of campus digital assistants are explainable, and that their conduct rests on principles acceptable to all stakeholders across the campus. These services should also be seen as complementary to our world view rather than superior to us. [5] If we bear this in mind, the campus digital assistant will prove an indispensable tool to aid and support the countless decisions made on our campuses each day. Further evidence needs to be gathered to ascertain how effectively campus digital assistants inform decision making, and whether their use in this manner leads to improved student outcomes.

Notes:

  1. Aftab Hussain, Making the Complex Simple (24 October 2018)
  2. Norbert Wiener, The Human Use of Human Beings (1950)
  3. Mark Weiser and John Seely Brown, Designing Calm Technology (1995)
  4. James Hensman, Computer says you need an umbrella: Probabilistic Models for Complex Decision Making
  5. Dr. Jess Whittlestone, AI and Improving Human Decision-Making (21 May 2019)