Learning analytics: engaging stakeholders


Dubwise Version, Flickr, Creative Commons

Week 25 Activity 24: Outline a workshop to develop a vision for learning analytics in the organisation.

Creating our Learning Analytics Vision Workshop: 9.00 to 13.00

Attendees:

  • Senior VP of Organisational Development and Talent
  • Senior VP of HR Development
  • Head of Leadership Development and Learning Technology
  • Organisational Data Manager
  • Training Design Manager
  • Consultant of Learning Technology
  • IT business partner

Agenda

Introduction exercise – 9.00 to 9.30

  • Learning analytics myths and realities
  • Our role in creating the Vision

Introduction to learning analytics at a macro level – 9.30 to 10.15

  • What learning analytics makes possible – video
  • Who is doing what – brief analysis using short case studies and video

Break

Why bother? What learning analytics will enable in the organisation and how this links to our corporate vision – 10.35 to 11.15

  • How Learning Analytics will benefit our people and our organisation
    • Debate in groups: opportunities and threats
    • Confirm benefits and watch outs

Key issues to be addressed by our Vision – 11.15 to 11.45

  • Group work to identify key organisational issues (including blockers/processes) our Vision will need to address

Current Reality v Future Vision – 11.45 to 12.45

  • Discussion and visioning activity to build our vision and identify our gaps.
  • Confirm Vision

Next steps – 12.45 to 13.00


Visualising social networks


3D Social Networking, attributed to StockMonkeys.com, Creative Commons

This post is based on the paper 'Social networks adapting pedagogical practice: SNAPP' (Bakharia et al., 2009).

What can be revealed by a network diagram of students' discussions? (A rough code sketch of points 1 and 2 follows the list.)

  1. Identify disconnected, and therefore at-risk, students
    This is an early warning system: it could be used to support students with low levels of engagement and, through additional support, to encourage engagement within the broader group.
    Potential problems – the learner may not appreciate the tutor's intervention and may feel pressurised within the learning environment, with the risk of further disconnection.
  2. Identify key information brokers
    Individual learners could identify key people in their network, connect with and 'follow' them, and discuss and debate issues knowing they will learn from interaction with a broader group of students.
    Potential problems – key information brokers may not broker 'effective' debate and may be more vocal on issues other than those which are learning related.
  3. Identify potentially high- and low-performing students so learning interventions can be planned
    Tutors can design appropriate interventions, allowing a tailored support approach.
    Potential problems – learners may resent being 'monitored' and being selected for additional support.
  4. Indicate the extent to which a learning community is developing
    Harness the learning created in a community of practice and create deeper meaning for students.
    Potential problems – this assumes the creation of a learning community is positive for everyone in the group, and it may influence peer and tutor perceptions of the commitment of learners who contribute least.
  5. Provide a before-and-after snapshot of interactions occurring pre and post learning interventions
    Learning designers can 'test' whether the learning intervention met the learning objectives, and learners can assess their progress.
    Potential problems – the number rather than the quality of interactions is revealed.
  6. Provide timely data for students to benchmark their performance and engagement against peers
    Learners can see how they are performing in relation to their peers and seek further support if required.
    Potential problems – the level of interaction rather than its quality is revealed; this may also impact a learner's confidence, with the result that they stop contributing altogether.
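
To make points 1 and 2 concrete, here is a minimal Python sketch (my own illustration, not the SNAPP tool itself) of how a forum reply network might be analysed; the students and reply pairs are invented.

```python
# A minimal sketch (not SNAPP itself) of analysing a forum reply network.
# Student names and reply pairs are invented for illustration.
import networkx as nx

replies = [  # (author, replied_to) pairs harvested from forum posts
    ("amy", "ben"), ("ben", "amy"), ("cal", "amy"),
    ("ben", "cal"), ("dee", "amy"),
]
students = {"amy", "ben", "cal", "dee", "eli"}  # eli has not interacted

G = nx.DiGraph()
G.add_nodes_from(students)
G.add_edges_from(replies)

# Point 1: disconnected, at-risk students have no interactions at all.
at_risk = [s for s in students if G.degree(s) == 0]

# Point 2: information brokers sit on many shortest paths between peers,
# so high betweenness centrality is one way to surface them.
centrality = nx.betweenness_centrality(G)
brokers = sorted(centrality, key=centrality.get, reverse=True)[:2]

print("At risk:", at_risk)         # -> ['eli']
print("Likely brokers:", brokers)  # e.g. ['amy', 'ben']
```

Degree and betweenness centrality are only two of many possible measures; SNAPP itself visualises the network rather than simply listing scores.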

 

References

Bakharia, A., Heathcote, E. and Dawson, S. (2009) 'Social networks adapting pedagogical practice: SNAPP' in Atkinson, R.J. and McBeath, C. (eds) Same Places, Different Spaces, Proceedings ascilite 2009, 26th Annual ascilite International Conference, Auckland, 6–9 December 2009, Auckland, The University of Auckland, Auckland University of Technology and Australasian Society for Computers in Learning in Tertiary Education (ascilite); also available online at http://www.ascilite.org/conferences/auckland09/procs/bakharia-poster.pdf (accessed 17 July 2016).

Checkpoint analytics and process analytics


James Royal-Lawson, Flickr, Creative Commons

This activity applies checkpoint analytics and process analytics to the learning design of part of the H817 module at the Open University. I have chosen to review Block 1.

Block 1 covers five weeks: Getting to know each other, Examining innovation, Innovation and eLearning, Evaluating innovation, and Horizon scanning.

Where checkpoint analytics might be used

  • Getting to know each other: introducing self; setting up a blog; setting up and using OU Live
  • Examining innovation: Are OER both open and innovative? – websites visited
  • Innovation and eLearning: investigating a learning theory – group wiki completed and links visited
  • Evaluating innovation: choosing and evaluating a specific technology – group report production and links visited
  • Horizon scanning: significant new technologies – creation of a table and links visited

Why checkpoint analytics might be used

  • Is the learner engaging with the resources and activities required within the module design?

Where process analytics might be used

  • Getting to know each other: how to document reflection; reading an article and searching for relevant references; application of learning through blog/forum debate and comments
  • Examining innovation: Are OER both open and innovative? – thoughts on the questions posed, posted to the TGF; innovation in your context – application of concepts to one's own world, described in the TGF
  • Innovation and eLearning: a theory for elearning – stating reasons for agreement/disagreement in the TGF; investigating a learning theory – working in a team to develop knowledge; connectivism – a summary of personal analysis of connectivism completed
  • Evaluating innovation: choosing and evaluating a specific technology – group report content
  • Horizon scanning: application of thinking about new technologies to the learner's own environment, recorded in blog/TGF

Why process analytics might be used

  • Are the learning objectives of the module being realised?
  • Is the learner understanding the module material? – evidenced through analysis, discussion and application to their own environment

Who would benefit

Checkpoint analytics will help the tutor ensure the learner is on track and prompt for progress if appropriate. There is also the opportunity early in this module to assess individuals' competence with technology and to gain insight into where additional support may be required later.

Process analytics initially help learners and the tutor get to know each other, allow the tutor to explore the ways in which learners are engaging, and give insight into learner understanding.

Impact

Learners understand each other's environments, values and perspectives very quickly through collaboration, which sets the scene for the rest of the module.

The tutor has an early view of learners' study patterns, values and perspectives.

The tutor can prompt thinking and debate around topics to ensure learning objectives are being achieved.

I would prioritise checkpoint analytics at these key points in the block:

  1. Introductions – to monitor early engagement and as an early warning sign that help may be needed.
  2. Group Wiki created in week 3 – to monitor engagement in collaborative tasks and to inform group choice for week 4 activity
  3. Group report created in week 4 – again to evaluate engagement (thinking ahead as early warning sign for Block 3) and to identify where the tutor may need to check in with individual learners

I would prioritise process analytics at the following points

  1. Week 1 – are learners debating and commenting?
  2. Week 2 – are learners applying their research and reading to their own environment?
  3. Week 3 – do learners understand the key debates around learning theories?
  4. Weeks 4 & 5 – can learners evaluate the relevance of a specific technology?

It strikes me that my process analytics priorities are focused on the assessment requirements, which are tested in an assignment at the end of the block.

Learning analytics and learning design

This post is based on reading 'Informing pedagogical action: aligning learning analytics with learning design' (Lockyer et al., 2013).

The authors claim that the data collected is underused or even unused due to the lack of an underlying framework. They propose a framework built around two categories, checkpoint analytics and process analytics, and argue that it can be applied to provide information on the impact of learning activities.

Checkpoint analytics

The student has accessed the relevant resources of the learning design, e.g. shown through log-ins and pages visited. Checkpoint analytics measure which files, diagrams, etc. the learner has accessed (these are considered to be prerequisites to the learning).

The value of checkpoint analytics lies in providing teachers with insight into whether learners are progressing through the planned learning sequence.

Some pedagogical actions (the first is sketched in code below):

  • reports of student log-ins can be used to offer prompts for late starters
  • the teacher can initiate action
  • student participation levels can be reviewed to see whether everyone is participating in activities that require it.
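
As an illustration of the first action, here is a minimal sketch assuming a simple VLE access log; the column names, resources and students are invented.

```python
# A minimal sketch of prompting late starters from a VLE access log.
# Column names, resources and students are invented for illustration.
import pandas as pd

log = pd.DataFrame({
    "student":  ["amy", "amy", "ben", "cal"],
    "resource": ["intro_video", "reading_1", "intro_video", "reading_1"],
})
roster = ["amy", "ben", "cal", "dee"]    # dee has not logged in at all
required = {"intro_video", "reading_1"}  # checkpoints in the design

# Which resources has each student accessed so far?
seen = log.groupby("student")["resource"].apply(set)

# Flag anyone missing a required checkpoint so the tutor can prompt them.
for student in roster:
    missing = required - seen.get(student, set())
    if missing:
        print(f"Prompt {student}: not yet accessed {sorted(missing)}")
```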

Process analytics

The student has processed the learning and applies information and knowledge, e.g. shown through the tasks completed, forum postings and discussions. Process analytics measure whether the learner carries out the tasks they are expected to, using the provided learning resources to do so.

The value of process analytics lies in providing teachers with insight into the engagement levels of individual learners and the networks they have built, and therefore whether or not they have a support structure. Process analytics also have value in determining the level of understanding.

Some pedagogical actions (a rough sketch follows the list):

  • ideas are shared and discussed – the teacher can monitor the level of understanding
  • social network analysis allows identification of the effectiveness of each group's interaction process
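
Here is a minimal sketch of how forum activity might feed this kind of monitoring; the data are invented, and post counts and lengths are of course only crude proxies for understanding.

```python
# A minimal sketch of summarising forum activity as process analytics.
# The data are invented; word count is only a crude proxy for depth.
import pandas as pd

posts = pd.DataFrame({
    "student": ["amy", "amy", "ben", "ben", "cal"],
    "week":    [1, 2, 1, 2, 2],
    "words":   [120, 80, 45, 200, 15],
})

# Posts per student per week (engagement) and mean post length (a rough
# signal of how much processing each contribution involved).
summary = posts.groupby(["student", "week"])["words"].agg(["count", "mean"])
print(summary)
```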

I feel these categories are more useful than those previously explored (data-driven and pedagogy-driven). This is a practical, pragmatic framework and feels more user friendly.

References:

Lockyer, L., Heathcote, E. and Dawson, S. (2013) 'Informing pedagogical action: aligning learning analytics with learning design', American Behavioral Scientist, vol. 57, no. 10; also available online at http://libezproxy.open.ac.uk/login?url=http://dx.doi.org/10.1177/0002764213479367 (accessed 14 July 2016).

MAODE, H817, Open University, Block 4 Activity 11

Learning analytics in the library


Gord Webster, Flickr, Creative Commons

H817 Week 22 Activity 10

This week I'm looking at learning analytics in the library and have identified the following data collected by university libraries taking part in the Library Impact Data Project.

Data collected:

  • usage by discipline
  • number of items borrowed
  • items checked out from the library
  • number of library visits, measured via gate entries
  • number of hours in a year in which a student was logged into a library PC
  • hours logged into e-resources
  • total number of e-resources accessed
  • identification of most visited resources
  • pages visited and how long a student stays
  • pathways students take through the resources
  • student attainment
  • demographics of users
  • course discipline and method of study
  • final degree results
  • social networks and activity

Five ways in which these data sets might support analytics that could lead to improvements in learning and/or teaching (the first is sketched in code after the list):

  1. Identify the relationship between student attainment and library usage, and potentially identify at-risk students by analysing library usage. The teacher can then adjust their approach to at-risk individuals.
  2. Improve the quality and usefulness of eLearning resources through analysing the most visited resources and the length of time students spend on these pages.
  3. Create more meaningful links between resources, to promote the most useful and to lead students to other relevant resources – through analysing the pathways students take.
  4. Track the learner journey to design user-friendly and engaging content.
  5. Identify the knowledge gaps in individual students to highlight the need for academic intervention.
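
Here is a minimal sketch of the first idea; all the figures are invented, and a real analysis would need to handle confounders, cohort effects and the ethics of flagging individuals.

```python
# A minimal sketch of relating library usage to attainment (way 1).
# All figures are invented for illustration.
import pandas as pd

df = pd.DataFrame({
    "loans":       [3, 25, 0, 12, 7],     # items borrowed per student
    "e_resources": [10, 60, 2, 30, 18],   # e-resources accessed
    "final_mark":  [52, 78, 40, 66, 61],  # attainment
})

# How strongly does each usage measure track attainment?
print(df.corr()["final_mark"])

# Crude at-risk flag: bottom quartile on both usage measures.
low = (df["loans"] <= df["loans"].quantile(0.25)) & (
    df["e_resources"] <= df["e_resources"].quantile(0.25))
print("Potentially at risk:", df.index[low].tolist())
```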

MAODE, H817, Open University, Block 4, Activity 10

Pedagogy driven analytics

Ted Major, Dilemma, Creative Commons

Analytics and pedagogy

The Innovating Pedagogy report (Sharples et al., 2015) lists ten types of pedagogy that may transform education in the future. Here I explore three of them and look at how they could be supported by learning analytics.

Crossover learning

Crossover learning links educational content to the things that matter to learners in their everyday lives. It takes place in informal settings such as museums and field trips; these experiences are enriched by adding in knowledge from the classroom, while the educational experience is enriched by the everyday experience. These connected experiences spark further interest and motivation to learn.

Learners could be supported through analytics on user flow – quickly accessing paths they have previously followed. Teachers could be supported by being able to see where learners were active and how they related their informal surroundings back to classroom learning. This would enable individual follow-up and customisation.

Incidental learning

Incidental learning is informal learning that takes place outside the formal classroom and is unplanned and unintentional. Learning analytics could support the capturing of this learning through analysis of websites visited, how long the visits lasted, which pages were viewed and what was downloaded. Social networking could be analysed to help highlight the breadth of social learning taking place, and device use could be assessed to understand which devices were being used most for informal learning. Ultimately, through analysis, this type of learning may become more 'visible' to the learner and offer a broader picture of the whole learning experience.

Adaptive teaching

Adaptive teaching recognises the unique qualities of learners and aims to offer a bespoke learning experience which engages each individual, rather than a one-size-fits-all approach. Analysing learners' previous and current learning creates a personalised path through educational content, e.g. suggesting where to start new content and when to revisit and review previously viewed content. Learning analytics can also show and monitor progress; educators could use this knowledge to ensure their students are on track and as an early warning sign for problems. It may also be possible to deduce which parts of the content are working and where students struggle overall, so analytics can be harnessed to develop and improve the educational content and the overall learning experience.
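
To make the idea of a personalised path concrete, here is a minimal rule-based sketch; the topic names, mastery scores and threshold are invented, and real adaptive teaching systems use far richer models than this.

```python
# A minimal rule-based sketch of suggesting a personalised next step.
# Topic names, scores and the 0.6 threshold are invented; mastery scores
# are assumed to be derived already from quiz and activity data.
mastery = {"intro": 0.9, "theories": 0.45, "evaluation": 0.7}
sequence = ["intro", "theories", "evaluation", "horizon_scanning"]

def next_activity(mastery, sequence, threshold=0.6):
    """Revisit the first weak topic, otherwise start the next new one."""
    for topic in sequence:
        score = mastery.get(topic)
        if score is None:
            return f"start new topic: {topic}"
        if score < threshold:
            return f"revisit: {topic}"
    return "all topics mastered: review or extend"

print(next_activity(mastery, sequence))  # -> revisit: theories
```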

H817 Week 22 Activity 8: Analytics and Pedagogy

Google analytics – use in education


joiseyshowaa, Creative Commons

H817, MAODE, Open University, Block 4, Week 22, Activity 7

On reviewing Google Analytics, it becomes apparent that these data have the potential to generate learning analytics.

Here are some examples; a rough code sketch of the first follows them.

Active users: tracking learners to see that they are still active and to give early warning of potential drop-outs.

User explorer: would allow individual tracking through the learning experience.

Cohort analysis: defining common learner characteristics, such as first-time students versus seasoned students, would allow cohort behaviours to be compared.

Demographics: understanding the age and gender composition gives the opportunity to tailor content and interventions.

Geo (language and location): helps in understanding any potential difficulties experienced by non-native language speakers.

Behaviour: measuring how often learners return to certain learning activities or links could give insight into what needs improvement or what learners' preferences are.

Technology and Mobile: understanding how learners access content could be useful in ensuring design is fit for purpose and in planning for the future.

User flow: understanding the path learners take through a site could enable designers to improve the learner experience.
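
Since the underlying data are essentially timestamped page views, here is a minimal sketch of how the Active users idea might be computed by hand from such a log; the data are invented, and this does not use the Google Analytics API.

```python
# A minimal sketch of an 'active users' / drop-out early-warning report
# computed from a timestamped page-view log. The data are invented and
# this does not call the Google Analytics API.
import pandas as pd

views = pd.DataFrame({
    "student": ["amy", "ben", "amy", "cal", "ben"],
    "timestamp": pd.to_datetime([
        "2016-07-01", "2016-07-02", "2016-07-03",
        "2016-07-10", "2016-07-20",
    ]),
})

# Unique active learners per week.
weekly = views.set_index("timestamp").resample("W")["student"].nunique()
print(weekly)

# Early warning: anyone silent for 14+ days relative to the latest visit.
last_seen = views.groupby("student")["timestamp"].max()
cutoff = views["timestamp"].max() - pd.Timedelta(days=14)
print("Quiet for 14+ days:", last_seen[last_seen < cutoff].index.tolist())
```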

The challenges associated with this include:

  • data protection of learners' information and online behaviour
  • ethical issues around the gathering and application of some of the above data, e.g. Geo and User explorer
  • the ability to gather data against every possible analytic may lead to data being gathered because it's there rather than for a purpose
  • data can only show what has happened; skill is required in identifying trends for planning, e.g. technology and mobile.

 

Learning Analytics Definitions – a look through history

Learning Analytics

A review of definitions in Wikipedia shows how the definition of learning analytics has developed.

Early definitions focused on learning analytics having a clear learner perspective; over time this seems to have developed into a perspective which places the educator at the core of learning analytics.

2010 Siemens: LA is the use of intelligent data, learner data and analysis models.

Purpose

  1. provide learners with relevant content resources and social connections
  2. predict learner success
  3. perform necessary interventions

2011 Kozleski: LA is the use of intelligent data, learner data and analysis models

Purpose

  1. discover information and social connections for predicting and advising people’s learning

2012 Culatta: LA is the measurement, collection, analysis and reporting of data about learners and their contexts

Purpose

  1. Understanding and optimising learning and the environments in which it occurs

2012–2014: the definition remained the same.

July 2016: LA is the use of intelligent data, learner-produced data and analysis models.

Purpose

  1. to discover information and social connections for predicting and advising people’s learning

 

 

The Collaborative Project

Yobie, Collaboration, Creative Commons

I’ve been away from blogging for the last few weeks, totally absorbed by a six-week-long collaborative online project (Block 3 of H817).

My team and I were tasked with building a resource using Web 2.0 technologies to support reflective practice using a digital diary in a vocational scenario. We followed the Learning Design Studio approach to work individually and collaboratively.

It’s been a difficult but rewarding time, and the output is still being assessed, so only time will tell how successful we have been in meeting the brief.

For now I feel a sense of relief that I am on the final leg of my H817 journey – Block 4, Learning Analytics.

Collaborative working – first reflections

Thinker Mirror Reflection by Ted, Creative Commons

So, here we are at the end of the first two weeks of working collaboratively. It’s been an emotional roller coaster: frustration and worry at our (very) late start, and elation at finally making some decisions and seeing our website come to life.

We are a team of four, working collaboratively through a learning design process with the objective of producing a digital diary for reflective practice to support professional development. We have decided to focus on a diary to support ILM (Institute of Leadership and Management) accreditation.

In the first two weeks we have had to articulate the context; we are not quite there yet, but it’s coming together and it is time to reflect on the process. At this stage we have been tasked with answering three questions:

  1. Your contribution to the group effort of articulating the context.
  2. What you found challenging in this process.
  3. What you have learned from it.

So here goes…

  1. Your contribution to the group effort of articulating the context.
    I prompted initial thoughts by suggesting four possible scenarios for our shared context – all workplace/vocational, ranging from student placements to professional qualifications. They were fairly high level, but I had stated the educational challenge clearly, which became important when we later discussed and agreed our outline context. As project manager I have tried to keep the group on track and have initiated and prompted action by:
    – setting up the website
    – creating a Gantt chart for the group project
    – setting out agendas for meetings/discussions
    – writing up and distributing action points
    – creating joint documents in a shared work space (Google Docs)

    I have supported my colleagues by being available to them online and on the telephone as we have worked through creating personas, establishing concerns and issues and distilling the relevant forces.

  2. What have you found challenging in this process?

    Time management
    We all have commitments and lead busy lives, and so are available at different times. Using the communication channels we have set up has helped, although I constantly feel there is something I should be doing, looking at or commenting on.

    Managing my own emotions

    Panic
    It is very tempting to look at how the other groups are progressing… and then panic! To help manage these feelings I decided during this phase to stay out of other forums and threads and just dip in occasionally to get a sense of where others are going and whether we can learn anything from them (yes we can!).

    Worry
    We started late due to circumstance. At first I worried about this, but I feel we have made great progress and our work is coming together.

  3. What have you learned from it?
    I feel we have a strong sense of our shared vision and are very clear who we are designing the digital diary for; this has been essential in keeping us on track.

    Creating personas has really helped to bring ‘our users’ to life for me as ‘real’ people. Whilst they are all different, there are numerous similarities, and I think this will help us with the core design of our solution. The differences highlight the real-world differences we will experience and how we need to consider these within our design for it to be ‘fit for purpose’. They certainly reflect the different motivations, e.g. career progression, accreditation and learning new skills. The unique qualities aspect of the personas is making me reflect on what our design will need to achieve for it to be accepted by our users.

    Reviewing colleagues’ personas has broadened my thinking about what our design has to ‘do’ and fulfil for our users.

    Having a design space where we work separately but together is useful in that we can present our thoughts without censorship, develop our own thinking as we review everyone’s contributions and then draw joint conclusions from this.

    I am learning to be patient and trust the process.

H817 Open University Weeks 14-15 Activity 8.