Learning analytics: engaging stakeholders


Image credit: Dubwise Version, Flickr (Creative Commons)

Week 25 Activity 24: Outline a workshop to develop a vision for learning analytics in the organisation.

Creating our Learning Analytics Vision Workshop: 9.00 to 13.00

Attendees:

  • Senior VP of Organisational Development and Talent
  • Senior VP of HR Development
  • Head of Leadership Development and Learning Technology
  • Organisational Data Manager
  • Training Design Manager
  • Learning Technology Consultant
  • IT Business Partner

Agenda

Introduction exercise – 9.00 to 9.30

  • Learning analytics myths and realities
  • Our role in creating the Vision

Introduction to learning analytics at a macro level – 9.30 to 10.15

  • What learning analytics makes possible – video
  • Who is doing what – brief analysis using short case studies and video

Break

Why bother? What learning analytics will enable in the organisation, and how this links to our corporate vision – 10.35 to 11.15

  • How Learning Analytics will benefit our people and our organisation
    • Debate in groups: opportunities and threats
    • Confirm benefits and watch-outs

Key issues to be addressed by our Vision – 11.15 to 11.45

  • Group work to identify key organisational issues (including blockers/processes) our Vision will need to address

Current Reality v Future Vision – 11.45 to 12.45

  • Discussion and visioning activity to build our vision and identify our gaps.
  • Confirm Vision

Next steps – 12.45 to 13.00

Visualising social networks


Image credit: '3D Social Networking' by StockMonkeys.com (Creative Commons)

This blog post is based on the paper by Bakharia et al. (2009), 'Social networks adapting pedagogical practice: SNAPP'.

What can be revealed by a network diagram of students' discussions? (A short code sketch after the list below illustrates the first two points.)

  1. Identify disconnected and therefore at-risk students
    This acts as an early warning system and could be used to support students with low levels of engagement, and to encourage engagement within the broader group by offering additional support.
    Potential problems – the learner may not appreciate the tutor's intervention and may feel pressurised within the learning environment, with the risk of further disconnection.
  2. Identify key information brokers
    Individual learners could identify key people in their network, connect with them, 'follow' them and discuss and debate issues, knowing they will learn from interaction with a broader group of students.
    Potential problems – key information brokers may not broker 'effective' debate and may be more vocal on issues other than those which are learning related.
  3. Identify potentially high- and low-performing students so learning interventions can be planned
    Tutors can design appropriate interventions, allowing a tailored support approach.
    Potential problems – learners may resent being 'monitored' and being selected for additional support.
  4. Indicate the extent to which a learning community is developing
    Harness the learning created in a community of practice and create deeper meaning for students.
    Potential problems – this assumes the creation of a learning community is positive for everyone in the group, and it may influence peer and tutor perceptions of the commitment of learners who contribute least.
  5. Provide a before-and-after snapshot of interactions occurring pre and post learning interventions
    Learning designers can 'test' whether the learning intervention met the learning objectives, and learners can assess their progress.
    Potential problems – the number rather than the quality of interactions is revealed.
  6. Provide timely data for students to benchmark their performance and engagement against peers
    Learners can see how they are performing in relation to their peers and seek further support if required.
    Potential problems – the level of interaction rather than its quality is revealed, and this may dent a learner's confidence with the result that they stop contributing altogether.
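
SNAPP itself produces these diagrams from forum data, but the underlying idea can be sketched very simply. The code below is a minimal sketch, using the Python networkx library on a handful of invented reply pairs, of the first two points: surfacing isolated (potentially at-risk) students and possible information brokers. The names, and the choice of betweenness centrality as a 'broker' measure, are my own illustration rather than anything taken from the SNAPP paper.

```python
# A minimal sketch: build a reply network from invented forum data and look for
# isolated students and possible information brokers. Not SNAPP itself.
import networkx as nx

# Hypothetical (poster, replied_to) pairs extracted from a discussion forum
replies = [
    ("amy", "ben"), ("ben", "amy"), ("cat", "ben"),
    ("dan", "cat"), ("ben", "dan"),
]

G = nx.DiGraph()
G.add_nodes_from(["amy", "ben", "cat", "dan", "eve"])  # eve never posts or is replied to
G.add_edges_from(replies)

# 1. Disconnected / at-risk students: nodes with no interactions at all
isolated = [s for s in G.nodes if G.degree(s) == 0]

# 2. Key information brokers: students others connect through
#    (betweenness centrality is one common proxy)
brokers = sorted(nx.betweenness_centrality(G).items(), key=lambda kv: kv[1], reverse=True)

print("Potentially disconnected:", isolated)
print("Possible brokers:", brokers[:2])
```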

 

References

Bakharia, A., Heathcote, E. and Dawson, S. (2009) 'Social networks adapting pedagogical practice: SNAPP' in Atkinson, R.J. and McBeath, C. (eds) Same Places, Different Spaces, Proceedings ascilite 2009, 26th Annual ascilite International Conference, Auckland 6–9 December 2009, Auckland, The University of Auckland, Auckland University of Technology and Australasian Society for Computers in Learning in Tertiary Education (ascilite); also available online at http://www.ascilite.org/conferences/auckland09/procs/bakharia-poster.pdf (accessed 17 July 2016).

Checkpoint analytics and process analytics


Image credit: James Royal-Lawson, Flickr (Creative Commons)

This activity applies checkpoint analytics and process analytics to the learning design of part of the H817 module at the Open University. I have chosen to review Block 1.

Block 1 covers five topics: Getting to know each other; Examining innovation; Innovation and eLearning; Evaluating innovation; and Horizon scanning.

Where checkpoint analytics might be used

  • Introducing self
  • Setting up a blog
  • Setting up and using OU Live
  • Are OER both open and innovative? – websites visited
  • Investigating a learning theory – group wiki completed and links visited
  • Choosing and evaluating a specific technology – group report production and links visited
  • Significant new technologies – creation of a table and links visited

Why checkpoint analytics might be used

Is the learner engaging with the resources and activities required by the module design?

Where process analytics might be used

  • How to document reflection
  • Reading an article and searching for relevant references
  • Application of learning through blog/forum debate and comments
  • Are OER both open and innovative? – thoughts on the questions posed, posted to the TGF
  • Innovation in your context – application of concepts to the learner's own world, described in the TGF
  • A theory for eLearning – stating reasons for agreement/disagreement in the TGF
  • Investigating a learning theory – working in a team to develop knowledge
  • Connectivism – a summary of a personal analysis of connectivism completed
  • Choosing and evaluating a specific technology – group report content
  • Application of thinking about new technologies to the learner's own environment, recorded in blog/TGF

Why process analytics might be used

Are the learning objectives of the module being realised? Is the learner understanding the module material – evidenced through analysis, discussion and application to their own environment?

Who would benefit

Checkpoint analytics will help the tutor ensure the learner is on track and prompt for progress if appropriate. There is also the opportunity, early in this module, to assess individuals' competence with technology and gain insight into where additional support may be required later.

Process analytics initially help learners and the tutor get to know each other, allow the tutor to explore the ways in which learners are engaging, and give insight into learner understanding.

Impact

Learners understand each other's environments, values and perspectives very quickly through collaboration, which sets the scene for the rest of the module.

The tutor has an early view of learners' study patterns, values and perspectives.

The tutor can prompt thinking and debate around topics to ensure learning objectives are being achieved.

I would prioritise checkpoint analytics at key points in this block

  1. Introductions – to monitor early engagement and as an early warning sign that help may be needed.
  2. Group Wiki created in week 3 – to monitor engagement in collaborative tasks and to inform group choice for week 4 activity
  3. Group report created in week 4 – again to evaluate engagement (thinking ahead as early warning sign for Block 3) and to identify where the tutor may need to check in with individual learners

I would prioritise process analytics at the following points

  1. Week 1 – are learners debating and commenting?
  2. Week 2 – are learners applying their research and reading to their own environment?
  3. Week 3 – do learners understand the key debates around learning theories?
  4. Weeks 4 & 5 – can learners evaluate the relevance of a specific technology?

It strikes me that my process analytics priorities are focused on the assessment requirements, which are tested in an assignment at the end of the block.

Learning analytics and learning design

This blog is based on reading ‘Informing pedagogical action: aligning learning analytics with learning design’ (Lockyer et al., 2013)

The authors claim that the data collected is underused, or even unused, because an underlying framework is lacking. They propose a framework of checkpoint and process analytics, and argue that it can be applied to provide information on the impact of learning activities.

Checkpoint analytics

The student has accessed the relevant resources of the learning design, e.g. shown through log-ins and pages visited. Checkpoint analytics measure which files, diagrams, etc. the learner has accessed (these are considered to be prerequisites for the learning).

The value of checkpoint analytics lies in providing teachers with insight into whether learners are progressing through the planned learning sequence.

Some pedagogical actions

  • reports of student log-ins can be used to offer prompts for late starters (a rough sketch of this follows the list)
  • the teacher can initiate action
  • student participation levels can be reviewed to see whether all are participating in activities where participation is required.
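
As a rough illustration of the first action above, here is a minimal sketch of a checkpoint report that flags learners who have not reached a required activity page by a given date. The log rows, field names and checkpoint date are invented; a real report would draw on a VLE activity export.

```python
# A rough sketch of a checkpoint report: flag learners who have not reached a
# required activity page by a given date. Field names and dates are assumptions.
from datetime import datetime

CHECKPOINT = datetime(2016, 2, 14)          # e.g. end of week 1
REQUIRED_PAGE = "block1/introductions"      # activity every learner should reach

enrolled = {"amy", "ben", "cat", "dan"}
log = [
    {"student": "amy", "page": "block1/introductions", "timestamp": "2016-02-10T09:30"},
    {"student": "ben", "page": "block1/introductions", "timestamp": "2016-02-16T20:00"},
    {"student": "cat", "page": "block1/blog-setup",    "timestamp": "2016-02-12T14:00"},
]

reached = {
    row["student"]
    for row in log
    if row["page"] == REQUIRED_PAGE
    and datetime.fromisoformat(row["timestamp"]) <= CHECKPOINT
}

late_starters = enrolled - reached   # ben reached the page too late; cat and dan not at all
print("Prompt these learners:", sorted(late_starters))
```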

Process analytics

The student has processed the learning and applies information and knowledge, e.g. shown through the tasks completed, forum postings and discussions. Process analytics measure whether the learner carries out the tasks they are expected to, using the provided learning resources to do so.

The value of process analytics lies in providing teachers with insight into the engagement levels of individual learners, which networks they have built and therefore whether or not they have a support structure. They also have value in determining the level of understanding.

Some pedagogical actions

  • ideas are shared and discussed – the teacher can monitor the level of understanding (a simple post-count sketch follows this list)
  • social network analysis allows identification of the effectiveness of each group's interaction process
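
To make the contrast with checkpoint analytics concrete, here is a minimal sketch of one crude process measure: counting forum contributions per learner per activity. As noted elsewhere, this captures quantity rather than quality, and the post data and field names are invented for illustration.

```python
# A minimal sketch of one process measure: forum posts per learner per activity.
# It only counts contributions, not their quality. The data is illustrative.
from collections import Counter

posts = [
    {"student": "amy", "activity": "week2_oer_debate"},
    {"student": "amy", "activity": "week2_oer_debate"},
    {"student": "ben", "activity": "week2_oer_debate"},
    {"student": "cat", "activity": "week3_learning_theories"},
]

per_learner = Counter((p["student"], p["activity"]) for p in posts)
for (student, activity), n in sorted(per_learner.items()):
    print(f"{student:<5} {activity:<25} {n} posts")
```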

I feel these categories are more useful than those previously explored (data-driven and pedagogy-driven). This is a practical and pragmatic framework and feels more user-friendly.

References:

Lockyer, L., Heathcote, E. and Dawson, S. (2013) 'Informing pedagogical action: aligning learning analytics with learning design', American Behavioral Scientist, vol. 57, no. 10; also available online at http://libezproxy.open.ac.uk/login?url=http://dx.doi.org/10.1177/0002764213479367 (accessed 14 July 2016).

MAODE, H817, Open University, Block 4 Activity 11

Learning analytics in the library


Image credit: Gord Webster, Flickr (Creative Commons)

H817 Week 22 Activity 10

This week I’m looking at learning analytics in the library and have identified the following data collected by University libraries taking part in the Library Impact Data Project.

Data collected:

  • usage by discipline
  • number of items borrowed
  • items checked out from the library
  • number of library visits, measured via gate entries
  • number of hours in a year in which a student was logged into a library PC
  • hours logged into e-resources
  • total number of e-resources accessed
  • identification of most visited resources
  • pages visited and how long a student stays
  • pathways students take through the resources
  • student attainment
  • demographics of users
  • course discipline and method of study
  • final degree results
  • social networks and activity

Five ways in which these data sets might support analytics that could lead to the improvement of learning and/or teaching:

  1. Identify the relationship between student attainment and library usage, and potentially identify at-risk students by analysing library usage (a rough sketch follows this list). The teacher can then adjust their approach to at-risk individuals.
  2. Improve the quality and usefulness of eLearning resources by analysing the most visited resources and the length of time students spend on these pages.
  3. Create more meaningful links between resources, promoting the most useful and leading students to other relevant resources, by analysing the pathways students take.
  4. Track the learner journey to design user-friendly and engaging content.
  5. Identify the knowledge gaps in individual students to highlight the need for academic intervention.
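
As a rough illustration of point 1, the sketch below correlates invented library-usage figures with final marks and flags the lowest-usage quartile for a check-in. The data, column names and the choice of Spearman correlation are my own assumptions, not outputs of the Library Impact Data Project.

```python
# A small sketch for point 1: test whether library usage relates to attainment.
# The figures are invented; in practice they would come from an anonymised join
# of library usage records and final marks.
import pandas as pd

df = pd.DataFrame({
    "student_id":       ["s01", "s02", "s03", "s04", "s05", "s06"],
    "loans":            [12, 3, 25, 0, 8, 17],
    "e_resource_hours": [40, 5, 60, 2, 20, 35],
    "final_mark":       [68, 52, 74, 48, 60, 71],
})

# Spearman rank correlation copes better than Pearson with skewed usage counts
for usage_col in ["loans", "e_resource_hours"]:
    rho = df[usage_col].corr(df["final_mark"], method="spearman")
    print(f"{usage_col:>18}: rho = {rho:.2f}")

# Flag the lowest-usage quartile as potentially at risk and worth a check-in
threshold = df["e_resource_hours"].quantile(0.25)
print("Check in with:", list(df.loc[df["e_resource_hours"] <= threshold, "student_id"]))
```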

MAODE , H817, Open University, Block 4, Activity 10

Pedagogy driven analytics

Image credit: 'Dilemma' by Ted Major (Creative Commons)

Analytics and pedagogy

The Innovating Pedagogy report (Sharples et al., 2015) lists ten types of pedagogy that may transform education in the future. Here I explore three of them and look at how they could be supported by learning analytics.

Crossover learning

Crossover learning links educational content to the things that matter to learners in their everyday lives. It takes place in informal settings such as museums and field trips; these experiences are enriched by knowledge from the classroom, while classroom learning is in turn enriched by everyday experience. These connected experiences spark further interest and motivation to learn.

Learners could be supported through analytics on user flow – quickly accessing paths they have previously followed. Teachers could be supported by being able to see where learners were active and how they related their informal surroundings back to classroom learning. This would enable individual follow-up and customisation.

Incidental learning

Incidental learning is informal learning that takes place outside the formal classroom and is unplanned and unintentional. Learning analytics could support the capturing of this learning through analysis of websites visited, how long the visits lasted, which pages were viewed and what was downloaded. Social networking could be analysed to help highlight the breadth of social learning taking place, and device use could be assessed to understand which devices were being used most for informal learning. Ultimately, through analysis, this type of learning may become more 'visible' to the learner and offer a broader picture of the whole learning experience.

Adaptive teaching

Adaptive teaching recognises the unique qualities of learners and aims to offer a bespoke learning experience which engages each individual. Learning analytics could help educators offer a more bespoke learning experience rather than a one-size-fits-all approach. Analysing learners' previous and current learning creates a personalised path through educational content, e.g. suggesting where to start new content and when to revisit and review previously viewed content. Learning analytics can also show and monitor progress. Educators could use this knowledge to ensure their students are on track, and as an early warning sign for problems. It may also be possible to deduce which parts of the content are working and where, overall, students struggle, so analytics can be harnessed to develop and improve the educational content and the overall learning experience.
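
To make the idea of a personalised path a little more concrete, here is a toy sketch that suggests what to revisit and where to start next from quiz scores. The topics, scores and mastery threshold are invented, and real adaptive systems use far richer learner models than this.

```python
# A toy sketch of the adaptive idea: suggest what to revisit and where to start
# next based on quiz scores. All values are invented for illustration.
MASTERY = 0.8
syllabus = ["fractions", "ratios", "percentages", "probability"]
scores = {"fractions": 0.9, "ratios": 0.55, "percentages": None, "probability": None}

# Topics attempted but below the mastery threshold should be revisited
revisit = [t for t in syllabus if scores[t] is not None and scores[t] < MASTERY]

# The first unattempted topic is the suggested starting point for new content
next_topic = next((t for t in syllabus if scores[t] is None), None)

print("Revisit first:", revisit)     # ['ratios']
print("Then start:", next_topic)     # 'percentages'
```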

H817 Week 22 Activity 8: Analytics and Pedagogy

Google analytics – use in education


Image credit: joiseyshowaa (Creative Commons)

H817 MA ODE Open University, Block 4 Week 22 Activity 7

On reviewing Google Analytics it becomes apparent that these data have potential to generate learning analytics.

Here are some examples

Active users: tracking learners to see that they are still active and to give early warning indicators of potential drop outs.

User explorer: would allow individual tracking through the learning experience.

Cohort Analysis: defining common learner characteristics such as first time students versus seasoned students would allow cohort behaviours to be compared.

Demographics: understanding the age and gender composition gives the opportunity to tailor content and interventions.

Geo (language and location): helps in understanding any potential difficulties experienced by non-native language speakers.

Behaviour: measuring how often learners return to certain learning activities or links could give insight into what needs improvement or what learners' preferences are.

Technology and Mobile: understanding how learners access content could be useful in ensuring design is fit for purpose and planning for the future.

User flow: understanding the path learners take through a site could enable designers to improve the learner experience.
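
Google Analytics produces the cohort report mentioned above itself, but as a rough sketch of what cohort analysis means in a learning context, the code below compares first-time and returning learners on a simple engagement measure. The cohort labels and figures are invented assumptions, not anything exported from Google Analytics.

```python
# A rough sketch of the cohort idea: compare first-time and returning learners
# on a simple engagement measure. The data is invented for illustration.
import pandas as pd

df = pd.DataFrame({
    "learner":           ["amy", "ben", "cat", "dan", "eve", "fin"],
    "cohort":            ["first_time", "first_time", "returning",
                          "returning", "first_time", "returning"],
    "sessions_per_week": [2, 1, 4, 3, 1, 5],
})

# Summarise engagement per cohort so the two groups can be compared side by side
summary = df.groupby("cohort")["sessions_per_week"].agg(["count", "mean", "median"])
print(summary)
```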

The challenges associated with this include

  • data protection of learners' information and online behaviour
  • ethical issues around the gathering and application of some of the above data e.g. Geo and User explorer
  • the ability to gather data against every possible analytic may lead to data being gathered because it’s there rather than being gathered for a purpose
  • data can only show what has happened; skill is required in identifying trends for planning, e.g. technology and mobile.

 

Introducing learning analytics


Image credit: Norquist (Creative Commons)

Based on Learning analytics: drivers, developments and challenges (Ferguson, 2012)

Learning analytics is a new field that has emerged in the last decade with roots in business intelligence, web analytics, educational data mining and recommender systems.

The goals of what can be achieved, and how these goals will be achieved, still have to be defined.

Learning analytics differ from the related fields of academic analytics and Educational Data Mining (EDM).

There are a number of definitions of learning analytics. The current prevalent definition was set out in a call for papers for the first International Conference on Learning Analytics and Knowledge (LAK 2011):

'Learning analytics is the measurement, collection, analysis and reporting of data about learners and their contexts, for purposes of understanding and optimising learning and the environments in which it occurs.'

Drivers

There are a number of factors driving the development of learning analytics:

  • How business uses analytics to extract value from data sets (big data) to drive recommendation engines, identify patterns of behaviour and develop advertising campaigns
  • The widespread introduction of LMSs is creating larger data sets. These data are being collated, but reporting and visualisation have been largely non-existent.
  • Online learning take-up has increased
  • Increasing demand for educational institutions to measure and demonstrate improved performance
  • Emergence of different interest groups: government, educational institutions and teachers/learners

A bit of history

  • 1979: Open University could reflect on 10 years monitoring the progress of distance students course by course
  • 1999: it was slowly becoming clear that collaborative online learning could take place (Dillenbourg, 1999)
  • 2000: EDM (Educational Data Mining) begins to emerge from the analysis of student-computer interaction with a strong emphasis on learning and teaching. In 2007 Romero and Ventura defined the goal of EDM as ‘turning learners into effective better learners’
  • 2001: Second generation web opened up new ways of collecting web content from various sources, processing it and exchanging the results with other programmes (Berners-Lee et al., 2001)
  • In contrast the early use of the term learning analytics referred to business intelligence about e-learning (Mitchell and Costello, 2000)
  • 2003 onwards: social and pedagogical approaches to analytics began to emerge. Social Network Analysis (SNA) was a significant development. SNA is the investigation of networks and can be used to 'investigate and promote collaborative and co-operative connections between learners, tutors and resources, helping them to extend and develop their capabilities'.
  • 2008: Pedagogic theory starts to emerge strongly as an approach to optimising and understanding learning.

Political and economic drivers

Pressure to measure the quality of education, to meet the challenge of declining educational standards (principally in the USA), drove development. 'Academic analytics' began to evolve, linking data sets to improved educational decision making.

The field is rapidly expanding

In 2008 analytics and EDM split.

Analytical tools are rapidly developing and enabling different types of analysis e.g. LOCO-Analyst which provides feedback focused on the quality of the learning process.

With tools becoming more powerful, ethics and privacy issues began to emerge.

In 2010 the field of analytics split again, with learning analytics gradually breaking away from academic analytics. Siemens presented an early definition in 2010, which was refined and has become the current prevalent definition described earlier in this post.

Put simply

  • EDM focused on the technical challenge
  • Learning analytics focused on the educational challenge (optimising opportunities for learning online)
  • Academic analytics focused on the political/educational challenge

Overlaps between them still remain, though there have been further attempts to distinguish between them (Long and Siemens, 2011).

In 2012 learning analytics were identified as a technology to watch in the NMC Horizon Report.

New tools such as GRAPPLE can now extract data from across an entire personal learning environment (PLE).

Learning analytics are distinguished by their concern for providing value to learners, and are employed to optimise both learning and the environments in which it takes place.

Reference

Ferguson, R. (2012) 'Learning analytics: drivers, developments and challenges', International Journal of Technology Enhanced Learning (IJTEL), vol. 4, nos. 5/6, pp. 304–17; also available online at http://oro.open.ac.uk/36374/ (accessed 6 July 2016).

 

 

 

Learning Analytics Definitions – a look through history

Learning Analytics

A review of definitions in Wikipedia shows how the definition of learning analytics has been developing.

Early definitions focused on learning analytics having a clear learner perspective; over time this seems to have developed into a perspective which places the educator at the core of learning analytics.

2010 Siemens: LA is the use of intelligent data, learner data and analysis models.

Purpose

  1. provide learners relevant content resources and social connections
  2. predict learner success
  3. perform necessary interventions

2011 Kozleski: LA is the use of intelligent data, learner data and analysis models

Purpose

  1. discover information and social connections for predicting and advising people’s learning

2012 Culatta: LA is the measurement, collection, analysis and reporting of data about learners and their contexts

Purpose

  1. Understanding and optimising learning and the environments in which it occurs

2012–2014: the definition remained the same

July 2016: LA is the use of intelligent data, learner-produced data and analysis models

Purpose

  1. to discover information and social connections for predicting and advising people’s learning

 

 

The Collaborative Project


Image credit: 'Collaboration' by Yobie (Creative Commons)

I’ve been away from blogging for the last few weeks, totally absorbed by a six-week-long collaborative online project (Block 3 of H817).

My team and I were tasked with building a resource using Web 2.0 technologies to support reflective practice using a digital diary in a vocational scenario. We followed the Learning Design Studio approach to work individually and collaboratively.

It’s been a difficult but rewarding time and the output is still being assessed so only time will tell how successful we have been in meeting the brief.

For now, I feel a sense of relief that I am on the final leg of my H817 journey – Block 4, Learning Analytics.