Access2OER:7 Case Studies
In requesting stories and solutions, some contributors provided more extensive analyses of initiatives to extend access to teaching and learning.
- 7.1 The "Connectivism and Connective Knowledge" solution
- 7.2 The RECOUP manual
- 7.3 The Global Grid for Learning
7.1 The "Connectivism and Connective Knowledge" solution
Stephen Downes wrote a comprehensive summary of the "Connectivism and Connective Knowledge" course - an experiment in open online teaching. This contribution fits into the present report particularly well, as it uses our classification of access issues, presented in Chapter 2, to analyse the accessibility of the course.
7.1.1 The context
"Connectivism and Connective Knowledge" was a course run by George Siemens and Stephen Downes in October/November 2008. It was offered through the University of Manitoba, Canada, as a credit course, but the course was also offered for free to any person interested. It came to be called a MOOC - a Massive Open Online Course.
George Siemens and Stephen Downes acted as instructors. Logistical internet support was offered by the University of Manitoba, by Dave Cormier, and by Stephen Downes. Overall 24 students registered and paid fees to the University of Manitoba. Another 2,200 people signed up for the course as non-paying participants. All aspects of the course were offered to both paying and non-paying participants, with the exception that paying participants submitted assignments for grading and received course credit.
Participants registered from around the world, with an emphasis on the English- and Spanish-speaking world. The course was offered in English; Spanish participants translated key materials for their own use. The course attracted a wide range of participants, from college and university students to researchers, professors and corporate practitioners.
The course was designed to operate in a distributed environment and did not centralize on a single platform or technology. With the assistance of university staff and Dave Cormier, George Siemens and Stephen Downes set up the following course components:
- a wiki, in which the course outline and major links were provided;
- a blog, in which course announcements and updates were made;
- a Moodle installation, in which threaded discussions were held;
- an Elluminate environment, in which synchronous discussions were held;
- an aggregator and newsletter, in which student contributions were collected and distributed.
The instructors encouraged students to create their own course components, which would be linked with the course structure. Students contributed, among other things:
- three separate Second Life communities, two of which were in Spanish;
- 170 individual blogs, on platforms ranging from Blogger and edublogs, to WordPress and more;
- numerous concept maps and other diagrams;
- Wordle summaries;
- a Google group, with a separate group for registered participants.
7.1.2 Key barriers
- Access in terms of awareness: Given that the course attracted 2,200 people, the lack of awareness must have been addressed in some fashion! However, the course had not been widely advertised; it had been posted on George Siemens' and Stephen Downes' newsletters, which in turn are leading sources of information to a community that would be interested in the course.
- Access in terms of local policy/attitude: One of the major attractions was that the course was offered by the University of Manitoba. It was necessary to convince the university to offer an open course, which George Siemens managed by adding the enrolment component. In one sense, the paying students funded the non-paying students; in another sense, offering it as an open course created sufficient marketing to attract the paying students. The university was satisfied with this result and will be employing the same model again.
- Access in terms of languages: There was no multilingual access. However, because the instructors encouraged participants to create their own resources, they created the conditions that enabled a large, self-managed Spanish-language component to the course.
- Access in terms of relevance: The design of the course - as a distributed connectivist-model course - created a structure in which the content formed a cluster of resources around a subject area, rather than a linear set of materials that all students had to follow. Because participants were creating their own materials, in addition to the resources found and created by George Siemens and Stephen Downes, it became apparent in the first week that no participant could read or view all the materials. The instructors made it very clear that the expectation was that participants should sample the materials, selecting only those they found interesting and relevant, thereby creating a personal perspective on the materials that would inform their discussions.
- Access in terms of licensing: All course content and recordings were licensed under Creative Commons Attribution-NonCommercial-ShareAlike.
- Access in terms of file formats: The instructors did not try to provide access in all formats; rather, they employed a wide variety of formats for different materials and encouraged mash-ups, translations and other adaptations.
- Access in terms of infrastructure: Basic course material was provided in HTML and plain text; however, various course components required more bandwidth. The use of UStream proved useful to nobody, as the bandwidth requirements were too great even for the instructors. Skype worked well for planning and recording, but not for instructing. Elluminate was effective with limited bandwidth, but had a limit on the number of seats that could be offered (it was capped at 200, although Elluminate said they would extend this as needed). All audio MP3 recordings were made available for download. Second Life was accessible only to those with the platform and sufficient bandwidth. Essentially, the structure of the course provided a wide range of access types, making it possible for people with limited infrastructure to participate, while still employing more intensive applications.
- Access in terms of discovery: A search tool was not provided; the major resource related to discovery had nothing to do with search. The provision of a daily newsletter to aggregate and distribute course content proved to be a vital link for participants. A steady newsletter subscription of 1,870 persisted through the duration of the course. In evaluations and feedback participants said that the newsletter was their lifeline. A full set of archives was provided, allowing people to explore the material chronologically and make up days they had missed.
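The aggregator-and-newsletter mechanism described above is simple to sketch: collect items from participants' RSS feeds and bundle them into one daily plain-text digest. The following is a minimal illustration in Python using only the standard library; the feed content and newsletter layout are invented for the example, not taken from the actual CCK08 tooling.

```python
import xml.etree.ElementTree as ET

def items_from_rss(rss_xml):
    """Extract (title, link) pairs from the <item> elements of an RSS 2.0 feed."""
    root = ET.fromstring(rss_xml)
    for item in root.iter("item"):
        yield item.findtext("title", ""), item.findtext("link", "")

def daily_digest(feeds):
    """Bundle items from many participant feeds into one plain-text newsletter."""
    lines = ["Daily Newsletter", "=" * 16]
    for rss_xml in feeds:
        for title, link in items_from_rss(rss_xml):
            lines.append(f"* {title}\n  {link}")
    return "\n".join(lines)

# A toy participant feed; in practice each feed would be fetched over HTTP.
feed = """<rss version="2.0"><channel>
  <item><title>My week 1 reflections</title>
        <link>http://example.org/post1</link></item>
</channel></rss>"""

print(daily_digest([feed]))
```

The point of the design is that discovery is pushed to participants once a day, rather than requiring them to search or poll 170 individual blogs themselves.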
- Access in terms of ability and skills: One of the things that was noticed was that, by combining participants from a wide range of skill sets, people were able to - and did - help each other out. This ranged from people answering questions and providing examples in the discussion areas, to people commenting on and supporting each others' blogs, to those with more skills setting up resources and facilities, such as the translations and Second Life discussion areas.
7.1.3 Scalability and transferability
How might the solution "scale"?
The connectivist model employed in this course might offer a unique approach to the problem of scalability. The instructors could not provide everything that was needed for 2,200 students - nor did they try. Rather, they encouraged and created the conditions for participants to provide additional resources for themselves. The role of the instructors and facilitators is essential in this model, yet their role is not to provide solutions but rather to establish a basic structure.
Regarding marking and recognition, the course offered an insight that may prove useful in the future. While 24 students were graded by the University of Manitoba, the instructors received (and granted) a request for a student from another country to be assessed and graded by their own institution. All assignment descriptions were displayed as part of the open course, and the assessment metric was also distributed, so other institutions could know everything needed in order to provide evaluation and feedback.
What questions should we ask about this solution to add to our understanding of enabling access to knowledge and learning resources?
The main questions are in the area of applicability: would this model work in other areas? Would it work in other communities?
In addition, Stephen Downes is exploring the question of whether this approach could be supported with technology designed specifically for this model - for example, the creation of serialized feeds to automatically create and conduct cohorts through the course material.
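One way to read the "serialized feeds" idea is that course items are released relative to each cohort's own start date, replaying the original schedule for every new group. The sketch below illustrates that interpretation; the item titles and day offsets are invented, and this is not the actual technology Stephen Downes is developing.

```python
from datetime import date, timedelta

# Course items expressed as (day offset from course start, title).
# Offsets and titles are hypothetical examples.
COURSE_ITEMS = [
    (0, "Week 1: What is Connectivism?"),
    (7, "Week 2: Rethinking Epistemology"),
    (14, "Week 3: Properties of Networks"),
]

def serialize_feed(cohort_start, today):
    """Return the items a cohort starting on `cohort_start` should have
    received by `today`: the feed is replayed relative to the cohort's
    own start date rather than the original course dates."""
    released = []
    for offset, title in COURSE_ITEMS:
        release_date = cohort_start + timedelta(days=offset)
        if release_date <= today:
            released.append((release_date.isoformat(), title))
    return released

# A cohort that began on 1 March has received only the first two items
# by 10 March; the third unlocks on 15 March.
for when, title in serialize_feed(date(2009, 3, 1), date(2009, 3, 10)):
    print(when, title)
```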
Implications and adoption: what are the implications of this solution for OER and enabling access to knowledge and learning?
The course - which came to be known simply as CCK08 - was a landmark in open access because, while providing the formal requirements of open learning - course structure and content, recognition, assessment and credentials - it nonetheless operated on a very different model from other OER initiatives. Materials for the course were not "produced" in the traditional sense. Rather, the instructors created a framework, populated that framework with open materials already extant on the web, added some commentary and videos of their own, conducted open online sessions and recordings, and created the infrastructure for wide student participation.
- Course materials may be accessed from the course wiki
- Course blog
- Newsletter site (note that newsletter publication ceased with the end of the course)
- Some participant feeds
7.2 The RECOUP manual
This description of the RECOUP manual was provided by the present author.
7.2.1 About RECOUP and the manual
RECOUP is the "Research Consortium on Educational Outcomes and Poverty", based at the Faculty of Education, University of Cambridge, UK. The research undertaken by RECOUP examines the impact of education on the lives and livelihoods of people in developing countries, particularly those living in poorer areas and in poorer households. Its purpose is to generate new knowledge that will improve education and poverty reduction strategies in developing countries, through an enhanced recognition of education's actual and potential role.
RECOUP is a research partnership involving institutions of the South and North. The partnership brought together people from varied disciplines, making it crucial to foster a shared understanding, not only of how to do research, but also of what is meant by research itself.
The "RECOUP manual" itself is an outcome of this partnership. Initially a manual was developed to support workshops that were organised in India, Kenya, Ghana and Pakistan. It became apparent that the manual would be very useful to help roll out further workshops and training in the required research skills. The lead authors (Nidhi Singal and Roger Jeffrey) decided to turn the manual into an Open Educational Resource.
Nidhi and Roger write about the manual:
"The spirit of dialogue, experimentation and a belief in the value of qualitative research that we developed during the process of refining the manual underpins our desire to share this work. We do not believe the process is over now that the manual is on the web: we hope everyone who reads and uses this material will tell us how it went, and engage with us and other users to adapt and improve it."
7.2.2 How does the initiative address the access barriers previously discussed?
The RECOUP materials are accessible in terms of:
- language and culture: indeed, they have been developed specifically to bridge and connect research cultures;
- relevance: the content is highly relevant to the participants in the research consortium;
- licensing (Creative Commons);
- skills: at least within the consortium, the manual has become part of the training materials, and appropriate training can be provided.
There are two further aspects that merit attention.
OER for development
First, the process is (in the author's view) exemplary in terms of OER in a development context. The OER was created because there was an identified need for training. The process involved good communication and North-South partnership, leading to a resource that is appropriate and suitable for the intended areas. The researchers themselves decided that the best impact would be achieved by opening up the resource and making it available as widely as possible.
Formats and infrastructure
Second, there is a small addendum to the story concerning formats and infrastructure. The RECOUP site uses MediaWiki (like WikiEducator and Wikipedia) and, as such, it offers the same access features, including PDF printing. Also, all additional documents are available for download, bundled as zip files. However, the standard MediaWiki design ("MonoBook") is quite large (~130kB). The authors wanted the manual to be as accessible as possible on low-bandwidth connections, so they produced an alternative low bandwidth version. Users comparing the low bandwidth version with the original site will notice that the low bandwidth site is faster, even on a good connection. Users on a slow connection will see a significant improvement. (Compare also on a mobile phone: even with Opera Mini, the low bandwidth site is faster.) The same technologies can be applied to any MediaWiki, such as WikiEducator or Wikipedia.
As a final twist, the computer hosting the low-bandwidth version does not need to have a special relation to the site itself. It can be located anywhere, for instance, on the local area network of a university. Pages that have been accessed once remain available, even if the internet connection fails temporarily. (Of course, pages can only stay up to date when there is internet access. As soon as the internet is restored, pages update automatically.) The technology is quite basic, but it would be quite feasible to develop it a little further, so that schools and universities have a local version of WikiEducator, Wikipedia, Medpedia, etc. always running, irrespective of whether the internet connection was working.
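The behaviour described above - serve the live page when the connection is up, refresh a local copy as you go, and fall back to that copy when the connection fails - is essentially a read-through cache. A minimal sketch in Python follows; the cache directory name is hypothetical, and a production deployment would use a real caching proxy rather than this toy.

```python
import hashlib
import os
import urllib.request

CACHE_DIR = "wiki_cache"  # hypothetical local cache directory

def cached_fetch(url, timeout=5):
    """Read-through cache: fetch the page and store a copy on disk;
    if the network is unavailable, serve the last copy fetched."""
    os.makedirs(CACHE_DIR, exist_ok=True)
    path = os.path.join(CACHE_DIR, hashlib.sha256(url.encode()).hexdigest())
    try:
        with urllib.request.urlopen(url, timeout=timeout) as resp:
            body = resp.read()
        with open(path, "wb") as f:   # refresh the cached copy
            f.write(body)
        return body
    except OSError:                   # network down or host unreachable
        if os.path.exists(path):
            with open(path, "rb") as f:
                return f.read()       # serve the stale local copy
        raise
```

Pages stay up to date only while there is connectivity, exactly as the text notes: once the connection returns, the next fetch silently refreshes the local copy.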
7.3 The Global Grid for Learning
This case study was provided by Theo Lynn from Dublin City University. Dublin City University is a partner in the Global Grid for Learning initiative with Cambridge University Press, the Cambridge University Centre for Applied Research in Educational Technologies (CARET), Arizona State University and Obeikan Research and Development. The Global Grid for Learning initiative (GGfL) is attempting to address many of the issues mentioned already by building a digital content pipeline to connect educators to a billion digital resources over the next ten years.
Regarding an earlier comment in the discussion, Theo notes that the "travel well" concept is a tough nut to crack. (The term "travels well" is used colloquially to describe resources that are easy to use and re-use. See also notes on the Global Teacher Network OER Workshop, held in April 2009.) One of the ways GGfL is dealing with this "travel well" idea is to break down content into learning assets and structured learning objects instead of large aggregate units. The more granular the resource, the better it will "travel". GGfL also recognises the need to provide scaffolding to enable users to shape this content to suit their local needs.
GGfL has encountered three challenges:
- There needs to be a balance between commercial content, free but not open content, and open content, as well as the system, repository and enabling workflow process to distribute this in a device agnostic, bandwidth optimised way. To get to a billion resources, it is assumed 80-90% will be free or open resources. Ideally content needs to be local; unfortunately neither content nor metadata has been crosswalked for local context. Similarly, resources need to be culturally appropriate.
- Many countries worldwide want content but have no way of finding it. There needs to be a hosted discovery, exchange and delivery system. Commercial publishers need to be convinced to price on a micro-object level and to index pricing for the economic capacity of the target country.
- Even when content and systems are provided, teachers often do not have the capacity to teach using the content or systems, or to teach learners how to use them. Capacity to develop local content is also limited.
The GGfL project solution is to provide a central content repository and federated brokerage system, with common file and metadata standards, transcoding tools, etc. for commercial, free and open sources. To deal with free, as opposed to open, content GGfL has had to cater for two options for contributors - their own license or a Creative Commons license. To date, the focus is on attracting commercial publishers, as they will be the hardest to get on board for competitive reasons. So far, GGfL has over four million resources.

First, GGfL has built a web service that plugs into common platforms. Second, GGfL has nearly finished building a free, centrally hosted portal which has search, discovery, Google applications for education and some additional community features. GGfL hopes to extend this to include a Learning Management System (LMS) over time, although this has additional cost ramifications. GGfL is also putting together a free training programme (on searching, evaluating, downloading, modifying, describing and exposing content), and a twinning project to encourage educators to work together to build one piece of content. The site for free and open resources will be free. GGfL hopes to make a hosted LMS portal available to schools and colleges in developing countries by funding the project through twinning commercial licenses between schools and education systems in the developed and developing countries.
GGfL has developed a wide variety of tools to match materials across curriculum standards, editing for cultural appropriateness and exchanging content. The project has begun ambitiously in the US and Arab States nearly simultaneously. It hopes to expand once it is established in these initial regions.