This blog is based on the paper by Bakharia et al. (2009), 'Social networks adapting pedagogical practice: SNAPP'.
What can a network diagram of students' discussions reveal? (A minimal sketch of this kind of analysis appears after the list below.)
identify disconnected, and therefore at-risk, students
This acts as an early warning system and could be used to support students with low levels of engagement, and to encourage engagement within the broader group by giving additional support. Potential problems: the learner may not appreciate the tutor's intervention and may feel pressurised within the learning environment, with the risk of further disconnection.
identify key information brokers
Individual learners could identify key people in their network, connect with them, 'follow' them, and discuss and debate issues in the knowledge that they will learn from interaction with a broader group of students. Potential problems: key information brokers may not broker 'effective' debate and may be more vocal on issues other than those related to learning.
identify potentially high- and low-performing students so that learning interventions can be planned
Tutors can design appropriate interventions, allowing a tailored approach to support. Potential problems: learners may resent being 'monitored' and singled out for additional support.
indicate the extent to which a learning community is developing
Harness the learning created in a community of practice and create deeper meaning for students. Potential problems: this assumes that the creation of a learning community is positive for everyone in the group, and it may influence peer and tutor perceptions of the commitment of learners who contribute least.
provide a before-and-after snapshot of interactions occurring pre- and post-learning interventions
Learning designers can 'test' whether the learning intervention met the learning objectives, and learners can assess their progress. Potential problems: the number rather than the quality of interactions is revealed.
provide timely data for students to benchmark their performance and engagement against peers
Learners can see how they are performing in relation to their peers and seek further support if required. Potential problems: the level rather than the quality of interaction is revealed; this may also dent a learner's confidence, with the result that they stop contributing altogether.
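By way of illustration, here is a minimal sketch of the kind of analysis a tool like SNAPP performs, assuming forum interactions can be exported as simple (poster, replied-to) pairs. The student names and the `replies` data are hypothetical, and this sketches the general technique rather than SNAPP's actual implementation.

```python
import networkx as nx

# Hypothetical forum data: (poster, replied_to) pairs
replies = [
    ("Ana", "Ben"), ("Ben", "Ana"), ("Cara", "Ana"),
    ("Dev", "Cara"), ("Ana", "Dev"), ("Ben", "Cara"),
]
students = {"Ana", "Ben", "Cara", "Dev", "Eli"}  # Eli never interacts

G = nx.DiGraph()
G.add_nodes_from(students)
G.add_edges_from(replies)

# Disconnected, and therefore potentially at-risk, students
isolated = [s for s in students if G.degree(s) == 0]
print("Potentially at risk:", isolated)

# Key information brokers: students with high betweenness centrality,
# i.e. those sitting on many shortest paths between other students
centrality = nx.betweenness_centrality(G)
brokers = sorted(centrality.items(), key=lambda kv: kv[1], reverse=True)
print("Likely brokers:", brokers[:2])
```

Betweenness centrality is one common way to operationalise 'information broker'; other centrality measures (degree, closeness) would give slightly different pictures of the same network.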
Bakharia, A., Heathcote, E. and Dawson, S. (2009) 'Social networks adapting pedagogical practice: SNAPP', in Atkinson, R.J. and McBeath, C. (eds) Same Places, Different Spaces, Proceedings ascilite 2009, 26th Annual ascilite International Conference, Auckland, 6–9 December 2009, Auckland, The University of Auckland, Auckland University of Technology and Australasian Society for Computers in Learning in Tertiary Education (ascilite); also available online at http://www.ascilite.org/conferences/auckland09/procs/bakharia-poster.pdf (accessed 17 July 2016).
This activity applies checkpoint analytics and process analytics to the learning design of part of the H817 module at the Open University. I have chosen to review Block 1.
Getting to know each other
Innovation and eLearning
Where checkpoint analytics might be used
Setting up a blog
Setting up and using OU Live
Are OER both open and innovative? – websites visited
Investigating a learning theory – group wiki completed and links visited
Choosing and evaluating a specific technology – group report production and links visited
Significant new technologies – creation of a table and links visited
Why checkpoint analytics might be used
Is the learner engaging with the resources and activities required within the module design?
Where process analytics might be used
How to document reflection
Reading an article and searching for relevant references
Application of learning through blog/forum debate and comments
Are OER both open and innovative? – thoughts on the questions posed, posted to the tutor group forum (TGF)
Innovation in your context – application of concepts to the learner's own world, described in the TGF
A theory for elearning – stating reasons for agreement/disagreement in the TGF
Investigating a learning theory – working in a team to develop knowledge
Connectivism – a summary of personal analysis of connectivism completed
Choosing and evaluating a specific technology – group report content
Application of thinking about new technologies to the learner's own environment – recorded in blog/TGF
Why process analytics might be used
Are the learning objectives of the module being realised?
Does the learner understand the module material? This is evidenced through analysis, discussion and application to their own environment.
Who would benefit?
Checkpoint analytics will help the tutor ensure the learner is on track and prompt for progress if appropriate. There is also the opportunity, early in this module, to assess individuals' competence with technology and gain insight into where additional support may be required later.
Process analytics initially help learners and the tutor get to know each other, allow the tutor to explore the ways in which learners are engaging, and give insight into learner understanding.
Learners understand each other's environments, values and perspectives very quickly through collaboration, which sets the scene for the rest of the module.
The tutor gains an early view of learners' study patterns, values and perspectives.
The tutor can prompt thinking and debate around topics to ensure learning objectives are being achieved.
I would prioritise checkpoint analytics at key points in this block:
Introductions – to monitor early engagement and as an early warning sign that help may be needed.
Group wiki created in week 3 – to monitor engagement in collaborative tasks and to inform group choice for the week 4 activity
Group report created in week 4 – again to evaluate engagement (thinking ahead, as an early warning sign for Block 3) and to identify where the tutor may need to check in with individual learners
I would prioritise process analytics at the following points:
Week 1 – are learners debating and commenting?
Week 2 – are learners applying their research and reading to their own environment?
Week 3 – do learners understand the key debates around learning theories?
Weeks 4 and 5 – can learners evaluate the relevance of a specific technology?
It strikes me that my process analytics priorities are focused on the assessment requirements which are tested in an assignment at the end of the block.
This blog is based on reading 'Informing pedagogical action: aligning learning analytics with learning design' (Lockyer et al., 2013).
The authors claim that collected data are underused or even unused owing to the lack of an underlying framework. They propose a framework comprising two categories, checkpoint analytics and process analytics, and argue that it can be applied to provide information on the impact of learning activities.
Checkpoint analytics show whether the student has accessed the relevant resources of the learning design, e.g. through log-ins and pages visited. They measure which files, diagrams, etc. the learner has accessed (these are considered prerequisites to the learning).
The value of checkpoint analytics lies in providing teachers with insight into whether learners are progressing through the planned learning sequence.
Some pedagogical actions
reports of student log-ins can be used to offer prompts for late starters
the teacher can initiate action
student participation levels can be reviewed to see whether everyone is participating in activities where all are required to (a minimal sketch of such a check follows this list)
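To make these actions concrete, here is a minimal sketch of such a participation check, assuming the VLE can export an access log as (student, resource) records. The log, the student names and the list of required resources are all hypothetical.

```python
# Hypothetical VLE access log: (student, resource) records
access_log = [
    ("Ana", "week1_reading"), ("Ana", "week1_video"),
    ("Ben", "week1_reading"),
]
required = {"week1_reading", "week1_video"}  # assumed prerequisites
students = {"Ana", "Ben", "Cara"}

# Build up the set of resources each student has accessed
accessed = {s: set() for s in students}
for student, resource in access_log:
    accessed[student].add(resource)

# Checkpoint report: flag students missing required resources
for student in sorted(students):
    missing = required - accessed[student]
    if missing:
        print(f"{student} has not accessed: {', '.join(sorted(missing))}")
```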
Process analytics show whether the student has processed the learning and applies information and knowledge, e.g. through tasks completed, forum postings and discussions. They measure whether the learner carries out the tasks they are expected to, using the provided learning resources.
The value of process analytics lies in providing teachers with insight into the engagement levels of individual learners, which networks they have built and therefore whether or not they have a support structure. They also have value in determining the level of understanding.
Some pedagogical actions
ideas are shared and discussed – the teacher can monitor the level of understanding (a minimal sketch of a simple engagement count follows this list)
social network analysis allows identification of the effectiveness of each group's interaction process
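As a simple illustration, the sketch below counts each learner's forum posts per week as a crude engagement signal. The post data is hypothetical and, as noted elsewhere in this blog, such counts reveal the level rather than the quality of interaction.

```python
from collections import Counter

# Hypothetical forum posts: (student, week) records
posts = [
    ("Ana", 1), ("Ana", 1), ("Ben", 1),
    ("Ana", 2), ("Cara", 2), ("Cara", 2),
]

# Posts per student per week; Ben's silence in week 2 would stand out
activity = Counter(posts)
for (student, week), count in sorted(activity.items(), key=lambda kv: kv[0][1]):
    print(f"Week {week}: {student} posted {count} time(s)")
```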
I feel these categories are more useful than those previously explored (data-driven and pedagogy-driven). This is a practical, pragmatic framework and feels more user-friendly.
This week I'm looking at learning analytics in the library and have identified the following data collected by university libraries taking part in the Library Impact Data Project:
usage by discipline
number of items borrowed
items checked out from the library
number of library visits, measured via gate entries
number of hours in a year in which a student was logged into a library PC
hours logged into e-resources
total number of e-resources accessed
identification of most visited resources
pages visited and how long a student stays
pathways students take through the resources
demographics of users
course discipline and method of study
final degree results
social networks and activity
Five ways in which these data sets might support analytics that could lead to the improvement of learning and/or teaching:
Identify the relationship between student attainment and library usage, and potentially identify at-risk students by analysing their library usage; the teacher can then adjust their approach to at-risk individuals (a minimal sketch of this kind of correlation analysis follows this list).
To improve the quality and usefulness of eLearning resources by analysing the most visited resources and the length of time students spend on these pages.
Create more meaningful links between resources to promote the most useful and to lead students to other relevant resources – through analysing the pathways students take.
Track the learner journey to design user friendly and engaging content.
Identify the knowledge gaps in individual students to highlight the need for academic intervention.
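As a rough illustration of the first of these, here is a minimal sketch correlating library borrowing with final marks, assuming per-student records can be joined across the data sets above. The records are hypothetical, and a correlation alone would not show that library use causes better attainment.

```python
from statistics import correlation  # requires Python 3.10+

# Hypothetical per-student records: (items_borrowed, final_mark)
records = [(42, 68), (10, 52), (55, 74), (3, 45), (28, 61)]

borrowed = [items for items, _ in records]
marks = [mark for _, mark in records]

# Pearson correlation between library borrowing and final marks
r = correlation(borrowed, marks)
print(f"Correlation between items borrowed and final mark: {r:.2f}")
```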
MAODE, H817, Open University, Block 4, Activity 10
The Innovating Pedagogy report (Sharples et al., 2015) lists ten types of pedagogy that may transform education in the future. Here I explore three of them and look at how they could be supported by learning analytics.
Crossover learning links educational content to the things that matter to learners in their everyday lives. It takes place in informal settings such as museums and field trips; these experiences are enriched by adding in knowledge from the classroom, while the educational experience is enriched in turn by the everyday experience. These connected experiences spark further interest and motivation to learn.
Learners could be supported through analytics on user flow – quickly re-accessing paths they have previously followed. Teachers could be supported by being able to see where learners were active and how they related their informal surroundings back to classroom learning. This would enable individual follow-up and customisation.
Incidental learning is informal learning that takes place outside the formal classroom and is unplanned and unintentional. Learning analytics could support the capturing of this learning through analysis of websites visited, how long the visits lasted, which pages were viewed and what was downloaded. Social networking could be analysed to help highlight the breadth of social learning taking place, and device use could be assessed to understand which devices were being used most for informal learning. Ultimately, through analysis, this type of learning may become more 'visible' to the learner and offer a broader picture of the whole learning experience.
Adaptive teaching recognises the unique qualities of learners and aims to offer a bespoke learning experience which engages each individual, rather than a one-size-fits-all approach. Analysing learners' previous and current learning creates a personalised path through educational content, e.g. suggesting where to start new content and when to revisit and review previously viewed content. Learning analytics can also show and monitor progress; educators could use this knowledge to ensure their students are on track and as an early warning sign of problems. It may also be possible to deduce which parts of the content are working and where, overall, students struggle, so analytics can be harnessed to develop and improve the educational content and the overall learning experience.
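As a simple illustration of the kind of rule an adaptive system might apply, the sketch below suggests a learner's next step from quiz scores. The topics, the scores and the mastery threshold are hypothetical assumptions, not a description of any real adaptive engine.

```python
# Hypothetical quiz scores per topic, in course order (None = not attempted)
scores = {"intro": 0.9, "theory": 0.55, "application": None}
MASTERY = 0.7  # assumed mastery threshold

def next_step(scores: dict) -> str:
    """Suggest revisiting the first weak topic, otherwise the next new one."""
    for topic, score in scores.items():
        if score is not None and score < MASTERY:
            return f"Revisit '{topic}' (score {score:.0%} is below {MASTERY:.0%})"
    for topic, score in scores.items():
        if score is None:
            return f"Start new content: '{topic}'"
    return "All topics mastered - move on to the next block"

print(next_step(scores))
```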
Based on 'Learning analytics: drivers, developments and challenges' (Ferguson, 2012).
Learning analytics is a new field that has emerged in the last decade with roots in business intelligence, web analytics, educational data mining and recommender systems.
The goals – what can be achieved and how it will be achieved – still have to be defined.
Learning analytics is distinct from related fields such as academic analytics and educational data mining (EDM).
There are a number of definitions of learning analytics. The current prevalent definition was set out in a call for papers for the first International Conference on Learning Analytics and Knowledge (LAK 2011):
'Learning analytics is the measurement, collection, analysis and reporting of data about learners and their contexts, for purposes of understanding and optimising learning and the environments in which it occurs.'
There are a number of factors driving the development of learning analytics:
How businesses use analytics to extract value from large data sets ('big data') to drive recommendation engines, identify patterns of behaviour and develop advertising campaigns
The widespread introduction of learning management systems (LMSs) is creating larger data sets. These data are being collated, but their reporting and visualisation have been largely non-existent.
Online learning take-up has increased
Increasing demand for educational institutions to measure and demonstrate improved performance
Emergence of different interest groups: government, educational institutions and teachers/learners
A bit of history
1979: the Open University could reflect on ten years of monitoring the progress of distance students, course by course
1999: it was slowly becoming clear that collaborative online learning could take place (Dillenbourg, 1999)
2000: EDM begins to emerge from the analysis of student-computer interaction, with a strong emphasis on learning and teaching. In 2007 Romero and Ventura defined the goal of EDM as 'turning learners into effective better learners'
2001: the second-generation web opened up new ways of collecting web content from various sources, processing it and exchanging the results with other programs (Berners-Lee et al., 2001)
In contrast, the early use of the term 'learning analytics' referred to business intelligence about e-learning (Mitchell and Costello, 2000)
2003 onwards: social and pedagogical approaches to analytics began to emerge. Social Network Analysis (SNA) was a significant development. SNA is the investigation of networks and can be used to 'investigate and promote collaborative and co-operative connections between learners, tutors and resources, helping them to extend and develop their capabilities'
2008: Pedagogic theory starts to emerge strongly as an approach to optimising and understanding learning.
Political and economic drivers
Measuring the quality of education to meet the challenge of declining educational standards, principally in the USA. 'Academic analytics' began to evolve, linking data sets with improved educational decision making.
The field is rapidly expanding
In 2008 analytics and EDM split.
Analytical tools are developing rapidly, enabling different types of analysis, e.g. LOCO-Analyst, which provides feedback focused on the quality of the learning process.
With tools becoming more powerful, ethics and privacy issues begin to emerge.
In 2010 the field of analytics splits again, with learning analytics gradually breaking away from academic analytics. Siemens presented the first early definition in 2010, which was refined and has become the current prevalent definition described earlier in this blog.
EDM focused on the technical challenge
Learning analytics focused on the educational challenge (optimising opportunities for learning online)
Academic analytics focused on the political/educational challenge
Overlaps between them still remain, though there have been further attempts to distinguish between them (Long and Siemens, 2011).
In 2012 learning analytics was identified as a technology to watch in the NMC Horizon Report.
New tools such as GRAPPLE can now extract data from across an entire personal learning environment (PLE).
Learning analytics is distinguished by its concern for providing value to learners, and is employed to optimise both learning and the environments in which it takes place.
Ferguson, R. (2012) 'Learning analytics: drivers, developments and challenges', International Journal of Technology Enhanced Learning (IJTEL), vol. 4, nos. 5/6, pp. 304–17; also available online at http://oro.open.ac.uk/36374/ (accessed 6 July 2016).
A review of definitions in Wikipedia shows how the definition of learning analytics is developing.
Early definitions focused on learning analytics having a clear learner perspective; over time this seems to have developed into a perspective which places the educator at the core of learning analytics.
2010 Siemens: LA is the use of intelligent data, learner data and analysis models.
provide learners with relevant content, resources and social connections
predict learner success
perform necessary interventions
2011 Kozleski: LA is the use of intelligent data, learner data and analysis models
discover information and social connections for predicting and advising people’s learning
2012 Culatta: LA is the measurement, collection, analysis and reporting of data about learners and their contexts
understanding and optimising learning and the environments in which it occurs
2012–2014: the definition remained the same
July 2016: LA is the use of intelligent data, learner-produced data and analysis models
to discover information and social connections for predicting and advising people’s learning
Reviewing trends, challenges and technologies in the higher education sector over the next five years
Considering the above trends and applying them to my world
Three areas for adoption by my organisation
Data are already routinely collected in this organisation to enable commercial decision making and to understand consumer behaviour. Current learning solutions are aligned with corporate goals and link to personal development plans for each employee. A logical next step would be to extend the culture of data mining into the learning space.
Learning analytics is the "measurement, collection, analysis and reporting of data about learners and their contexts, for purposes of understanding and optimising learning and the environments in which it occurs" (1st International Conference on Learning Analytics and Knowledge, 2011). Learning analytics offers opportunities for both the learner and the organisation. It gives the organisation the ability to assess completion of learning, provides detailed data on interaction with the various learning solutions (and so tests their value), and has the potential to enable more effective tailoring of learning to individuals.
Learning analytics allows learners to take control of their own learning by measuring their progress, helps them develop their capacity to self-assess through feedback, and develops their corporate knowledge through discussion and forum debates.
Issues to consider
Security, privacy and ethics
Defining measurements – needs to measure what matters
The majority of learning options provided in the organisation are face-to-face workshops on topics ranging from project management to soft-skills development, such as giving and receiving feedback. Pre-work is required for each of these workshops but is limited to a reflective question on setting an objective for the participant's own learning.
Flipped classroom is "a model of learning that rearranges how time is spent both in and out of class to shift the ownership of learning from the educators to the students" (Johnson et al., 2016).
The organisation is currently moving away from a Parent-Child culture and encouraging employees to take more ownership of their skills development. The flipped classroom provides the opportunity to support this and to shift the emphasis of learning ownership from the organisation to the individual. Another advantage would be more effective use of classroom time, enabling the sharing of learning, ideas and best practice across the organisation, rather than workshops being a 'teach-in' or PowerPoint-led. There is currently a 20% non-attendance rate; personal upfront investment in learning may lead to greater attendance.
Issues to consider
Individual commitment is required pre-workshop. How should attendees who have not completed their pre-learning be managed?
This organisation has a global and itinerant workforce. Current management learning solutions are face-to-face in the UK and are inaccessible to global employees (while this is not policy, the barriers to attending are immense).
Online learning will open up management training across the workforce and drive inclusion and openness, which are stated organisational goals.
Issues to consider
Replicating the existing face-to-face offer is not appropriate online; the current offer requires redesign.