Checkpoint analytics and process analytics

Image: James Royal-Lawson, Flickr, Creative Commons.

This activity applies checkpoint analytics and process analytics to the learning design of part of the H817 module at the Open University. I have chosen to review Block 1.

Block 1 covers five weeks: Getting to know each other; Examining innovation; Innovation and eLearning; Evaluating innovation; Horizon scanning.

Where checkpoint analytics might be used

  • Getting to know each other: introducing self; setting up a blog; setting up and using OU Live
  • Examining innovation: Are OER both open and innovative? – websites visited
  • Innovation and eLearning: investigating a learning theory – group wiki completed and links visited
  • Evaluating innovation: choosing and evaluating a specific technology – group report production and links visited
  • Horizon scanning: significant new technologies – creation of a table and links visited

Why checkpoint analytics might be used

Is the learner engaging with the resources and activities required within the module design?

Where process analytics might be used

  • Getting to know each other: how to document reflection; reading an article and searching for relevant references; application of learning through blog/forum debate and comments
  • Examining innovation: Are OER both open and innovative? – thoughts on the questions posed, posted to the TGF; innovation in your context – application of concepts to own world, described in the TGF
  • Innovation and eLearning: a theory for elearning – stating reasons for agreement/disagreement in the TGF; investigating a learning theory – working in a team to develop knowledge; connectivism – a summary of personal analysis of connectivism
  • Evaluating innovation: choosing and evaluating a specific technology – group report content
  • Horizon scanning: application of thinking about new technologies to the learner’s own environment, recorded in blog/TGF

Why process analytics might be used

Are the learning objectives of the module being realised? Is the learner understanding the module material – evidenced through analysis, discussion and application to their own environment.

Who would benefit

Checkpoint analytics will help the tutor ensure the learner is on track, and prompt for progress if appropriate. There is also the opportunity early in this module to assess individuals’ competence with technology and gain insight into where additional support may be required later.

Process analytics initially help learners and the tutor get to know each other, allow the tutor to explore the ways in which learners are engaging and give insight into learner understanding.

Impact

Learners understand each other’s environments, values and perspectives very quickly through collaboration, which sets the scene for the rest of the module.

The tutor has an early view of learners’ study patterns, values and perspectives.

The tutor can prompt thinking and debate around topics to ensure learning objectives are being achieved.

I would prioritise checkpoint analytics at key points in this block:

  1. Introductions – to monitor early engagement and as an early warning sign that help may be needed.
  2. Group Wiki created in week 3 – to monitor engagement in collaborative tasks and to inform group choice for week 4 activity
  3. Group report created in week 4 – again to evaluate engagement (thinking ahead, as an early warning sign for Block 3) and to identify where the tutor may need to check in with individual learners

I would prioritise process analytics at the following points:

  1. Week 1 – are learners debating and commenting?
  2. Week 2 – are learners applying their research and reading to their own environment?
  3. Week 3 – do learners understand the key debates around learning theories?
  4. Weeks 4 & 5 – can learners evaluate the relevance of a specific technology?

It strikes me that my process analytics priorities are focused on the assessment requirements, which are tested in an assignment at the end of the block.

Learning analytics and learning design

This post is based on reading ‘Informing pedagogical action: aligning learning analytics with learning design’ (Lockyer et al., 2013).

The authors claim that the data collected is underused, or even unused, due to the lack of an underlying framework. They propose a framework of checkpoint and process analytics, and argue that it can be applied to provide information on the impact of learning activities.

Checkpoint analytics

The student has accessed the relevant resources of the learning design, e.g. shown through log-ins and pages visited. Checkpoint analytics measure which files, diagrams, etc. the learner has accessed (these are considered to be prerequisites to the learning).

The value of checkpoint analytics lies in providing teachers with insight into whether learners are progressing through the planned learning sequence.

Some pedagogical actions:

  • reports of student log-ins can be used to offer prompts to late starters
  • the teacher can initiate follow-up action
  • student participation levels can be reviewed to check whether everyone is participating in activities where all are required to
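To make the checkpoint idea concrete for myself, here is a minimal sketch in Python. The log format, resource names and roster are all invented for illustration; a real VLE would expose this data differently.

```python
# Checkpoint analytics sketch: which learners have not yet accessed the
# resources the learning design treats as prerequisites?
# All names and the log format below are hypothetical.

REQUIRED_RESOURCES = {"intro_forum", "blog_setup_guide", "ou_live_tutorial"}
ROSTER = ["amy", "ben", "cal"]  # cal has not logged in at all

access_log = [
    {"learner": "amy", "resource": "intro_forum"},
    {"learner": "amy", "resource": "blog_setup_guide"},
    {"learner": "ben", "resource": "intro_forum"},
]

def missing_checkpoints(log, required, roster):
    """Map each learner to the required resources they have not accessed."""
    seen = {learner: set() for learner in roster}  # include non-starters
    for entry in log:
        seen.setdefault(entry["learner"], set()).add(entry["resource"])
    return {learner: required - accessed for learner, accessed in seen.items()}

for learner, missing in missing_checkpoints(access_log, REQUIRED_RESOURCES, ROSTER).items():
    if missing:
        print(f"{learner} has not yet accessed: {', '.join(sorted(missing))}")
```

A report like this is the kind of thing that would let a tutor prompt late starters before they fall behind.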

Process analytics

The student has processed the learning and applied information and knowledge, e.g. shown through tasks completed, forum postings and discussions. Process analytics measure whether the learner carries out the tasks they are expected to, using the provided learning resources to do so.

The value of process analytics lies in providing teachers with insight into the engagement levels of individual learners and the networks they have built, and therefore whether or not they have a support structure. They also have value in determining levels of understanding.

Some pedagogical actions:

  • ideas are shared and discussed – the teacher can monitor the level of understanding
  • social network analysis allows identification of the effectiveness of each group’s interaction process
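As a toy illustration of the social network analysis point, here is a minimal Python sketch. The forum post format is an assumption; the idea is simply to count each learner’s ties and flag anyone with none, since they may lack a support structure.

```python
# Process analytics sketch: build a who-replies-to-whom picture from forum
# posts and flag isolated learners. The post format is hypothetical.

from collections import defaultdict

posts = [
    {"author": "amy", "reply_to": None},
    {"author": "ben", "reply_to": "amy"},
    {"author": "amy", "reply_to": "ben"},
    {"author": "cal", "reply_to": None},  # cal posts but is never replied to
]

def interaction_degree(posts):
    """Count each learner's ties: replies made plus replies received."""
    degree = defaultdict(int)
    for post in posts:
        degree[post["author"]] += 0  # ensure every poster appears
        if post["reply_to"] is not None:
            degree[post["author"]] += 1
            degree[post["reply_to"]] += 1
    return degree

for learner, ties in interaction_degree(posts).items():
    if ties == 0:
        print(f"{learner} is isolated – a tutor check-in may be worthwhile")
```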

I feel these categories are more useful than those previously explored (data-driven and pedagogy-driven). This is a practical, pragmatic framework and feels more user-friendly.

References:

Lockyer, L., Heathcote, E. and Dawson, S. (2013) ‘Informing pedagogical action: aligning learning analytics with learning design’, American Behavioral Scientist, vol. 57, no. 10; also available online at http://libezproxy.open.ac.uk/login?url=http://dx.doi.org/10.1177/0002764213479367 (accessed 14 July 2016).

MAODE, H817, Open University, Block 4, Activity 11

Learning analytics in the library

Image: Gord Webster, Flickr, Creative Commons.

H817 Week 22 Activity 10

This week I’m looking at learning analytics in the library, and have identified the following data collected by university libraries taking part in the Library Impact Data Project.

Data collected:

  • usage by discipline
  • number of items borrowed
  • items checked out from the library
  • number of library visits, measured via gate entries
  • number of hours in a year in which a student was logged into a library PC
  • hours logged into e-resources
  • total number of e-resources accessed
  • identification of most visited resources
  • pages visited and how long a student stays
  • pathways students take through the resources
  • student attainment
  • demographics of users
  • course discipline and method of study
  • final degree results
  • social networks and activity

Here are five ways in which these data sets might support analytics that could lead to the improvement of learning and/or teaching:

  1. Identify the relationship between student attainment and library usage, and potentially identify at-risk students by analysing library usage. The teacher can then adjust their approach to at-risk individuals.
  2. Improve the quality and usefulness of eLearning resources by analysing the most visited resources and the length of time students spend on these pages.
  3. Create more meaningful links between resources, to promote the most useful and to lead students to other relevant resources, by analysing the pathways students take.
  4. Track the learner journey to design user-friendly and engaging content.
  5. Identify the knowledge gaps in individual students to highlight the need for academic intervention.
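As a rough sketch of point 1, the snippet below correlates borrowing counts with final grades and flags students whose usage falls well below the cohort average. The figures and the threshold are invented, and statistics.correlation requires Python 3.10 or later.

```python
# Library analytics sketch: relate library usage to attainment and flag
# possible at-risk students. All figures below are invented.

import statistics

students = {
    "amy": {"loans": 42, "grade": 68},
    "ben": {"loans": 5,  "grade": 51},
    "cal": {"loans": 30, "grade": 64},
    "dee": {"loans": 2,  "grade": 48},
}

loans = [s["loans"] for s in students.values()]
grades = [s["grade"] for s in students.values()]

# Pearson correlation between borrowing and final grade (Python 3.10+).
r = statistics.correlation(loans, grades)
print(f"loans vs grade: r = {r:.2f}")

# Flag students whose usage is well below the cohort mean (arbitrary cut-off).
threshold = statistics.mean(loans) * 0.5
for name, s in students.items():
    if s["loans"] < threshold:
        print(f"{name}: low library usage – possible at-risk indicator")
```

Correlation is not causation, of course, so a flag like this should prompt a conversation rather than a judgement.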

MAODE, H817, Open University, Block 4, Activity 10

Pedagogy-driven analytics

Image: Ted Major, ‘Dilemma’, Flickr, Creative Commons.

Analytics and pedagogy

The Innovating Pedagogy report (Sharples et al., 2015) lists ten types of pedagogy that may transform education in the future. Here I explore three of them and look at how they could be supported by learning analytics.

Crossover learning

Crossover learning links educational content to the things that matter to learners in their everyday lives. It takes place in informal settings such as museums and field trips; these experiences are enriched by knowledge from the classroom, while the educational experience is in turn enriched by everyday experience. These connected experiences spark further interest and motivation to learn.

Learners could be supported through analytics on user flow – quickly accessing paths they have previously followed. Teachers could be supported by being able to see where learners were active and how they related their informal surroundings back to classroom learning. This would enable individual follow-up and customisation.

Incidental learning

Incidental learning is informal learning that takes place outside the formal classroom and is unplanned and unintentional. Learning analytics could support the capturing of this learning through analysis of websites visited, how long the visits lasted, which pages were viewed and what was downloaded. Social networking could be analysed to help highlight the breadth of social learning taking place, and device use could be assessed to understand which devices were being used most for informal learning. Ultimately, through analysis, this type of learning may become more ‘visible’ to the learner and offer a broader picture of the whole learning experience.
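To see how such traces might be made visible, here is a minimal Python sketch that aggregates browsing logs into time per site and per device. Every entry is made up for illustration.

```python
# Incidental learning sketch: summarise informal browsing traces.
# All log entries below are invented.

from collections import defaultdict

visits = [
    {"site": "wikipedia.org", "minutes": 12, "device": "phone"},
    {"site": "youtube.com",   "minutes": 25, "device": "tablet"},
    {"site": "wikipedia.org", "minutes": 8,  "device": "laptop"},
]

time_per_site = defaultdict(int)
time_per_device = defaultdict(int)
for visit in visits:
    time_per_site[visit["site"]] += visit["minutes"]
    time_per_device[visit["device"]] += visit["minutes"]

print("Informal learning time by site:", dict(time_per_site))
print("Informal learning time by device:", dict(time_per_device))
```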

Adaptive teaching

Adaptive teaching recognises the unique qualities of learners and aims to offer a bespoke learning experience which engages each individual, rather than a one-size-fits-all approach. Analysing learners’ previous and current learning creates a personalised path through educational content, e.g. suggesting where to start new content and when to revisit and review previously viewed content. Learning analytics can also show and monitor progress; educators could use this knowledge to ensure their students are on track, and as an early warning sign for problems. It may also be possible to deduce which parts of the content are working and where students struggle overall, so analytics can be harnessed to develop and improve the educational content and the overall learning experience.
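A minimal Python sketch of one adaptive decision – should the learner revisit a topic or move on? – is below. The topic names, scores and the 0.7 mastery threshold are all assumptions for illustration.

```python
# Adaptive teaching sketch: pick the learner's next step from per-topic
# mastery scores. Topics, scores and the threshold are hypothetical.

MASTERY_THRESHOLD = 0.7
TOPIC_SEQUENCE = ["fractions", "ratios", "percentages"]

def next_step(scores):
    """Return the first topic not yet mastered, or signal completion.

    scores maps topic -> proportion of recent items answered correctly;
    unattempted topics count as 0.
    """
    for topic in TOPIC_SEQUENCE:
        if scores.get(topic, 0.0) < MASTERY_THRESHOLD:
            action = "revisit" if topic in scores else "start"
            return f"{action} {topic}"
    return "all topics mastered – offer extension material"

print(next_step({"fractions": 0.9, "ratios": 0.55}))  # -> revisit ratios
print(next_step({"fractions": 0.9, "ratios": 0.8}))   # -> start percentages
```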

H817 Week 22 Activity 8: Analytics and Pedagogy

Google analytics – use in education

Image: joiseyshowaa, Flickr, Creative Commons.

H817, MAODE, Open University, Block 4, Week 22, Activity 7

On reviewing Google Analytics it becomes apparent that its data have the potential to generate learning analytics.

Here are some examples:

Active users: tracking learners to see that they are still active, and to give early warning indicators of potential drop-outs.

User explorer: would allow individual tracking through the learning experience.

Cohort analysis: defining common learner characteristics, such as first-time students versus seasoned students, would allow cohort behaviours to be compared.

Demographics: understanding the age and gender composition gives the opportunity to tailor content and interventions.

Geo (language and location): helps in understanding any potential difficulties experienced by non-native language speakers.

Behaviour: measuring how often learners return to certain learning activities or links could give insight into what needs improvement, or what learners’ preferences are.

Technology and Mobile: understanding how learners access content could be useful in ensuring design is fit for purpose and planning for the future.

User flow: understanding the path learners take through a site could enable designers to improve the learner experience.
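To ground the ‘Active users’ example, here is a minimal Python sketch that scans last-activity dates (as might be exported from an analytics tool) for learners who have gone quiet. The data, field names and 14-day threshold are all assumptions.

```python
# Early-warning sketch: flag learners who have been inactive for too long.
# Dates and the threshold are invented for illustration.

from datetime import date

last_active = {
    "amy": date(2016, 7, 10),
    "ben": date(2016, 6, 12),  # quiet for several weeks
}

TODAY = date(2016, 7, 14)
INACTIVITY_DAYS = 14

for learner, seen in last_active.items():
    gap = (TODAY - seen).days
    if gap > INACTIVITY_DAYS:
        print(f"{learner} last active {gap} days ago – potential drop-out")
```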

The challenges associated with this include:

  • data protection of learners’ information and online behaviour
  • ethical issues around the gathering and application of some of the above data e.g. Geo and User explorer
  • the ability to gather data against every possible analytic may lead to data being gathered because it’s there rather than being gathered for a purpose
  • data can only show what has happened; skill is required in identifying trends for planning, e.g. technology and mobile


Introducing learning analytics

Image: Norquist, Flickr, Creative Commons.

Based on ‘Learning analytics: drivers, developments and challenges’ (Ferguson, 2012).

Learning analytics is a new field that has emerged in the last decade with roots in business intelligence, web analytics, educational data mining and recommender systems.

The goals of the field, and how they will be achieved, are still being defined.

Learning analytics is distinct from the related fields of academic analytics and educational data mining (EDM).

There are a number of definitions of learning analytics. The current prevalent definition was set out in a call for papers for the first International Conference on Learning Analytics and Knowledge (LAK 2011):

‘Learning analytics is the measurement, collection, analysis and reporting of data about learners and their contexts, for purposes of understanding and optimising learning and the environments in which it occurs.’

Drivers

There are a number of factors driving the development of learning analytics:

  • How business uses analytics to extract value from data sets (big data) to drive recommendation engines, identify patterns of behaviour and develop advertising campaigns
  • Widespread introduction of LMSs is creating larger data sets. These data are being collated, but their reporting and visualisation have been largely non-existent.
  • Online learning take-up has increased
  • Increasing demand for educational institutions to measure and demonstrate improved performance
  • Emergence of different interest groups: government, educational institutions and teachers/learners

A bit of history

  • 1979: The Open University could reflect on ten years of monitoring the progress of distance students, course by course
  • 1999: it was slowly becoming clear that collaborative online learning could take place (Dillenbourg, 1999)
  • 2000: EDM (Educational Data Mining) begins to emerge from the analysis of student-computer interaction with a strong emphasis on learning and teaching. In 2007 Romero and Ventura defined the goal of EDM as ‘turning learners into effective better learners’
  • 2001: The second-generation web opened up new ways of collecting web content from various sources, processing it and exchanging the results with other programs (Berners-Lee et al., 2001)
  • In contrast the early use of the term learning analytics referred to business intelligence about e-learning (Mitchell and Costello, 2000)
  • 2003 onwards: social and pedagogical approaches to analytics began to emerge. Social Network Analysis (SNA) was a significant development. SNA is the investigation of networks and can be used to ‘investigate and promote collaborative and co-operative connections between learners, tutors and resources helping them to extend and develop their capabilities’
  • 2008: Pedagogic theory starts to emerge strongly as an approach to optimising and understanding learning.

Political and economic drivers

A key driver has been measuring the quality of education to meet the challenge of declining educational standards, principally in the USA. ‘Academic analytics’ began to evolve, linking data sets with improved educational decision making.

The field is rapidly expanding

In 2008 analytics and EDM split.

Analytical tools are developing rapidly, enabling different types of analysis, e.g. LOCO-Analyst, which provides feedback focused on the quality of the learning process.

With tools becoming more powerful, ethics and privacy issues began to emerge.

In 2010 the field of analytics split again, with learning analytics gradually breaking away from academic analytics. Siemens presented the first early definition in 2010, which was refined and has become the current prevalent definition described earlier in this post.

Put simply:

  • EDM focused on the technical challenge
  • Learning analytics focused on the educational challenge (optimising opportunities for learning online)
  • Academic analytics focused on the political/educational challenge

Overlaps between them remain, though there have been further attempts to distinguish between them (Long and Siemens, 2011).

In 2012 learning analytics were identified as a technology to watch in the NMC Horizon Report.

New tools such as GRAPPLE can now extract data from across an entire personal learning environment (PLE).

Learning analytics is distinguished by its concern for providing value to learners, and is employed to optimise both learning and the environments in which it takes place.

Reference

Ferguson, R. (2012) ‘Learning analytics: drivers, developments and challenges’, International Journal of Technology Enhanced Learning (IJTEL), vol. 4, nos. 5/6, pp. 304–17; also available online at http://oro.open.ac.uk/36374/ (accessed 6 July 2016).


Collaborative working – first reflections

Image: ‘Thinker Mirror Reflection’ by Ted, Creative Commons.

So, here we are at the end of the first two weeks of working collaboratively. It’s been an emotional roller coaster: frustration and worry at our (very) late start, and elation at finally making some decisions and seeing our website come to life.

We are a team of four, working collaboratively through a learning design process with the objective of producing a digital diary for reflective practice to support professional development. We have decided to focus on a diary to support ILM (Institute of Leadership and Management) accreditation.

In the first two weeks we have had to articulate the context; we are not quite there yet, but it’s coming together and it is time to reflect on the process. At this stage we have been tasked with answering three questions:

  1. Your contribution to the group effort of articulating the context.
  2. What you found challenging in this process.
  3. What you have learned from it.

So here goes…

  1. Your contribution to the group effort of articulating the context.
    I prompted initial thoughts by suggesting four possible scenarios for our shared context – all workplace/vocational, ranging from student placements to professional qualifications. They were fairly high level, but I had stated the educational challenge clearly, which became important when we later discussed and agreed our outline context.

    As project manager I have tried to keep the group on track, and have initiated and prompted action by:
    – setting up the website
    – creating a Gantt chart for the group project
    – setting out agendas for meetings/discussions
    – writing up and distributing action points
    – creating joint documents in a shared work space  (Google Docs)

    I have supported my colleagues by being available to them online and on the telephone as we have worked through creating personas, establishing concerns and issues and distilling the relevant forces.

  2. What have you found challenging in this process?

    Time management
    We all have commitments and lead busy lives, and so are available at different times. Using the communication channels we have set up has helped, although I constantly feel there is something I should be doing, looking at, or commenting on.

    Managing my own emotions

    Panic
    It is very tempting to look at how the other groups are progressing… and then panic! To help manage these feelings I decided during this phase to stay out of other forums and threads, just dipping in occasionally to get a sense of where others are going and whether we can learn anything from this (yes, we can!).

    Worry
    We started late due to circumstances. At first I worried about this, but I feel we have made great progress and our work is coming together.

  3. What have you learned from it?
    I feel we have a strong sense of our shared vision and are very clear about who we are designing the digital diary for; this has been essential in keeping us on track.

    Creating personas has really helped to bring ‘our users’ to life for me as ‘real’ people. Whilst they are all different, there are numerous similarities, and I think this will help us with the core design of our solution. The differences highlight the real-world variations we will encounter and how we need to consider these within our design for it to be ‘fit for purpose’. They certainly reflect the different motivations, e.g. career progression, accreditation, learning new skills. The unique-qualities aspect of the persona is making me reflect on what our design will need to achieve for it to be accepted by our users.

    Reviewing colleagues’ personas has broadened my thinking about what our design has to ‘do’ and fulfil for our users.

    Having a design space where we work separately but together is useful in that we can present our thoughts without censorship, develop our own thinking as we review everyone’s contributions, and then draw joint conclusions from this.

    I am learning to be patient and trust the process.

H817 Open University Weeks 14-15 Activity 8.


A first attempt at using VideoScribe

Image: MOOC VideoScribe

Here we go… ‘write about an element of open education you have found interesting’. Hmm. I thought I’d have a go at working out loud again, and so I created my first VideoScribe video. It’s very simple and took an age to make, but I did it!

You can see it here

Working out loud… for real

Image: ‘Firsts’, Justin McGregor

So, I’ve talked about it enough and, at the risk of winning the prevaricator of the year award, I am reminding everyone that I talk about this… a lot!

Today is officially a day of alisantics firsts:

  • First Videoscribe ‘movie’
  • First voice-over… do I really sound like that, or is there a gremlin in the microphone?
  • First upload to YouTube
  • First share on WordPress
  • First real piece of Working out Loud!

Very simple… but as firsts go, very rewarding. After all, I am very much a ‘work in progress’.

Block 2 Activity 25, H817


Blogging – half time reflections

Image: Jabiz Ralsdana, Flickr.

I have enjoyed blogging far more than I had anticipated during the first half of the Open University Masters in Online and Distance Education module. I feel this is because I am producing something tangible with a purpose.

A notable development has been a lack of blogging when the module does not require it – this blog has been ‘silent’ for almost four weeks. Why is that?

  • I’ve been busy writing a 3,000-word assignment – excuse or valid reason? We all lead busy lives.
  • I deserved some time off after submitting my assignment – time off from what, exactly? Do I see blogging as work, or a chore?
  • My focus is on the next stage of the module which does not require blogging – do I need a reason to blog?

If I am to continue my blogging development after this module, these questions need serious consideration. John Stepper discusses creating habits and having regular blogging time each day.

I have to decide whether I see blogging as a fundamental part of my development and if so, make a conscious choice to create a blogging habit.

I am still very much a work in progress and the jury is out on this one. Watch this space…

Postscript: I have realised on re-reading this post that I have taken an extremely inward-focused perspective here. Finding the image I finally selected to head up this post has prompted me to consider that I just might also be writing for existing friends, and friends I have yet to meet… More food for thought.