Access2OER:OER exchange

From Bjoern Hassler's website

The original OERWIKI seems to be offline (December 2012). The Access2OER discussion pages are preserved here for reference! The final report in PDF is available here: Access2OER_Report.




1 An OER exchange infrastructure

1.1 What if?

What if any computer you bought (be it a netbook, desktop, server, or even just a hard drive or USB stick) could come preloaded with a free content collection? When you place the order, you choose from a large catalogue of OER/OCW/free content, and it comes preloaded on the computer, with no further connectivity needed. However, when and where you have connectivity, your chosen content can automatically be updated and extended. (And content collections can be freely transferred and shared, installed from the net, etc.)

What if any computer could also be pre-ordered with a free content production suite? Not just OpenOffice, but a few other key applications as well, for instance enabling podcasting with Audacity and course development with Moodle and eduCommons, plus a set of training materials?

What if the content you create could then be contributed back to the global community, even where there is little or no connectivity?

1.2 A proposal for improving OER delivery to (and exchange with) regions that have poor connectivity

The second broad thread for proposals might be to improve "OER delivery and exchange mechanisms", including:

  • Hybrid information delivery (N->S, S->N, S->S, N->N), seamless online/offline content delivery, caching
  • Suitable strategies for content packaging so that it can be delivered in this way.
  • Content transformation/transcoding (might also include wiki content transformation options to cover Wikipedia and WikiEducator)
  • Bandwidth management and bandwidth managed resource delivery
  • OER "accessibility" certification
  • Setting up a ("top level") domain structure (oer.org / oer.int): all content within this domain has to meet certain criteria (such as "CC" licensing, certain metadata and resource discovery, etc.)

In terms of the classification of access issues, this proposal addresses:

  • Social, awareness, policy, attitude, cultural:
    • Access in terms of local policy / attitude. (Do attitudes or policies pose barriers to using OER?)
  • Technical: Provision of OER
    • Access in terms of file formats. (Are the file formats accessible?)
  • Technical: Receiving OER
    • Access in terms of discovery. (If the OER is hidden, not searchable, not indexed, it's hard to find.)
    • Access in terms of ability and skills. (Does the end user have the right skills to access?)
  • Technical: Access in terms of internet connectivity / bandwidth (Slow connections pose a barrier to access.)

For some diagrams, see Access2OER:OER exchange cartoon.

1.3 Miro and BitTorrent

I'd like to hear what people think about the viability of technologies like BitTorrent in countries with low bandwidth. I actually live in a rural area of Massachusetts in the US, and my internet connection is spotty and hardly surpasses dial-up speeds -- BitTorrent has seemed like a good alternative for me. Files download in the background so that I can come back to them when the download is finished -- this means I have a delay in access, but I have access nevertheless (Miro is one example of BitTorrent video distribution).

The software in question is Miro.

1.4 Cascading download and exchange

"OER downloading and exchange system". That is to say "What if you could download OER materials easily, without bandwidth problems?"

There's a proposal right at the bottom of this email, so please do keep reading, and do put forward your thoughts as to whether a solution like this would be desirable, helpful, likely to work, etc.!

I'll start with some thoughts about Miro/BitTorrent/podcasts:

1.4.1 Some thoughts about Miro/BitTorrent/podcasts

  1. ABILITY TO DOWNLOAD SLOWLY. What is very valuable about the idea of using BitTorrent is the possibility of downloading a resource gradually, over a longer period of time. The same applies to podcasts: you can subscribe to a small file (the podcast RSS file), and then wait for the resources to download.
  2. ABILITY TO RESUME. What's also valuable is that there may be an application (such as Miro, or the commercial iTunes, or another podcatcher/download manager) that gradually downloads the resource. Moreover, you can quit the program and resume it some other time.
  3. NETWORK AWARENESS. What's not so good is that a typical podcatcher isn't aware of the network conditions: in an unmanaged network, it will take up whatever bandwidth it can. At the moment, many university networks are unusable because of (illegal) file-sharing activities. While downloading educational resources is of course a better use of bandwidth, it would nevertheless make the network unusable for essential services like email, unless it was managed properly. (A small sketch of a resumable, rate-limited download follows this list.)
  4. "NETWORK PROXIMITY". Also, a typical podcatcher is not aware of the fact that somebody on my local network has just downloaded the same file. Comparing the notion of 'podcasting' with 'BitTorrent': for podcasting, only a 'downlink' is required. BitTorrent (at least as a concept) also requires an 'uplink', which is often a lot smaller than the 'downlink', and the extra upload traffic could have a negative impact on the institution. However, BitTorrent does have the advantage that in principle it knows about 'other copies' of the same file that may be "nearby", in the sense that there is a good connection to them.
  5. FILE FORMATS. At the moment, podcatchers are 'multi-media' centric, which of course is why they were developed (i.e. for podcasts). iTunes, at least, only accepts a range of audio/video/PDF files, and nothing else. Download managers download anything (but are less aware of podcast feeds, though there may be hybrids).
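
A minimal sketch of points 1-3 (this is not how Miro actually works internally; the URL, rate limit and chunk size are made-up illustrations): a download in standard-library Python that resumes from a partial file via an HTTP Range request, and crudely caps its own bandwidth use.

  import os
  import time
  import urllib.request

  def resumable_download(url, dest, max_bytes_per_sec=32_000, chunk_size=8_192):
      """Download url to dest, resuming from any partial file already on disk
      and throttling to roughly max_bytes_per_sec.  Assumes the server honours
      HTTP Range requests."""
      start = os.path.getsize(dest) if os.path.exists(dest) else 0
      request = urllib.request.Request(url, headers={"Range": f"bytes={start}-"})
      with urllib.request.urlopen(request) as response, open(dest, "ab") as out:
          while True:
              t0 = time.monotonic()
              chunk = response.read(chunk_size)
              if not chunk:
                  break
              out.write(chunk)
              # Crude bandwidth cap: sleep so that, on average, we stay below
              # max_bytes_per_sec.  A network-aware client (point 3) would
              # instead ask the network how much bandwidth it may use.
              min_duration = len(chunk) / max_bytes_per_sec
              elapsed = time.monotonic() - t0
              if elapsed < min_duration:
                  time.sleep(min_duration - elapsed)

  # Hypothetical usage: interrupt and re-run, and the download picks up where it left off.
  # resumable_download("https://example.org/oer/lecture01.ogg", "lecture01.ogg")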

So in my view, what's needed is a 'cascading download' mechanism that knows about 'nearby' files and is aware of your bandwidth. Here's a potential scenario:

1.4.2 The Scenario

Suppose you're in Zambia, and you use an application like Miro to subscribe to a feed, say the Participatory Culture Foundation podcast. Normally, the connection would be made straight to the PCF server and would put immediate strain on your network, stopping others from browsing the web or doing email. However, with our new and improved download system "super miro", the subscription doesn't go straight to the PCF server: it goes to a local server at the school, then via a national Zambian school gateway (run by the NREN, providing an internet exchange point for Zambian schools and universities), and only then to the PCF server.

Also, the 'network' talks back to your "super miro" application, so that "super miro" doesn't take up all the available bandwidth; it knows how much bandwidth is available and restricts itself accordingly. The user then gets feedback:

This high-resolution video download will take a long time to download (approximately 18 hours). 
Would you like to download a low-resolution video preview (approximately 4 hours) or a low-resolution audio preview with some keyframes (1 hour) instead? 
Please choose:
[Get the high resolution video]    [Get the audio/image preview and notify me when the high res is ready]    [just get the low-res video preview]

So the user has the choice of getting a lower-resolution version (which needn't be provided by the PCF podcast itself, but could be generated elsewhere), with the option of waiting for the high-res video as well. The user chooses the audio/image preview, and has the file within an hour. When the user has watched this, "super miro" says: "A higher-res version is available - do you wish to download it?" If the user proceeds, then in a day or so they get an email notifying them that the high-res file is available on their school server, for immediate viewing.
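
The dialogue above boils down to a small calculation. A minimal sketch (the file sizes and the bandwidth figure are invented for illustration): given the sizes of the available variants and the bandwidth the network layer reports back, "super miro" can estimate how long each choice would take.

  def offer_variants(variants, available_bytes_per_sec):
      """variants maps a label to a size in bytes; returns (label, estimated hours)
      pairs, quickest first."""
      estimates = [(label, round(size / available_bytes_per_sec / 3600, 1))
                   for label, size in variants.items()]
      return sorted(estimates, key=lambda pair: pair[1])

  # Example with made-up sizes that roughly reproduce the dialogue above
  # (about 100 kbit/s of usable bandwidth, i.e. 12,500 bytes per second):
  for label, hours in offer_variants(
          {
              "high-resolution video": 800_000_000,
              "low-resolution video preview": 180_000_000,
              "audio preview with keyframes": 45_000_000,
          },
          available_bytes_per_sec=12_500):
      print(f"{label}: approximately {hours} hours")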

When the video has downloaded to the user, a copy is kept by the above chain of servers: the school server, the Zambian school gateway server, and perhaps another African internet exchange point outside Zambia. So others requesting the same file don't need to go back to the PCF podcast server to get the file. (However, every time a file is requested from any of those servers, the PCF podcast server gets a 'ping', so that they have good statistics about how their media is being used.)
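
A minimal sketch of that cascade (the server names are placeholders, and fetch and notify_origin are stand-ins for whatever transport and statistics mechanism would actually be used): try each server in turn, nearest first, and whichever copy is served, send the usage 'ping' back to the origin.

  # Hypothetical cascade, nearest first.
  CASCADE = [
      "http://server.school.example.zm",    # local school server
      "http://gateway.nren.example.zm",     # national Zambian school gateway
      "http://media.pcf.example.org",       # origin, e.g. the PCF podcast server
  ]

  def fetch_via_cascade(path, fetch, notify_origin):
      """fetch(url) returns the resource, or None if that server does not hold it;
      notify_origin(path, served_by) reports usage back to the origin server."""
      for base in CASCADE:
          resource = fetch(base + path)
          if resource is not None:
              notify_origin(path, served_by=base)   # origin keeps its statistics
              return resource
      raise FileNotFoundError(path)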

The same mechanism wouldn't just work for video/audio files; it would also work for Open CourseWare, for content packages, etc. For content packages (provided as zip files), "super miro" would be able to look inside the content package, and (just like with the video file) the user would have the option of downloading a lower-bandwidth version of the materials first, before downloading the whole content package. (That is to say, the content package can be downloaded in pieces, and then reassembled by "super miro" on the user side.)
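
A minimal sketch of the piece-wise idea (the piece size is arbitrary, and a real system would ship the pieces and the checksum through the cascade above): split a zip content package into small pieces on the sending side, then reassemble and verify it on the user's side.

  import hashlib
  from pathlib import Path

  PIECE_SIZE = 512 * 1024  # 512 KB pieces; an arbitrary illustrative choice

  def split_package(package_path, out_dir):
      """Split a package into numbered pieces; return its checksum for verification."""
      data = Path(package_path).read_bytes()
      Path(out_dir).mkdir(parents=True, exist_ok=True)
      for i in range(0, len(data), PIECE_SIZE):
          Path(out_dir, f"piece_{i // PIECE_SIZE:05d}").write_bytes(data[i:i + PIECE_SIZE])
      return hashlib.sha256(data).hexdigest()

  def reassemble_package(piece_dir, dest_path, expected_sha256):
      """Join the pieces back together and check the result against the checksum."""
      pieces = sorted(Path(piece_dir).glob("piece_*"))
      data = b"".join(p.read_bytes() for p in pieces)
      if hashlib.sha256(data).hexdigest() != expected_sha256:
          raise ValueError("package is incomplete or corrupted")
      Path(dest_path).write_bytes(data)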

The system would not just work online; it would also be possible to "prime" the system with content packages downloaded elsewhere, so a Zambian school server would not need to be on the internet. Teachers would be able to request content packages from the national Zambian school server, which would be mailed to them on DVD / memory stick / hard drive. Those content packages are installed on the Zambian school server, and become available to teachers "as if" the server was on the internet.
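
A minimal sketch of that "priming" step (the directory layout and index format are invented for illustration): copy content packages from the DVD or memory stick into the school server's content tree and record them in a simple index, so they look exactly like packages that arrived over the network.

  import json
  import shutil
  from pathlib import Path

  CONTENT_DIR = Path("/srv/oer/packages")   # assumed server content tree
  INDEX_FILE = Path("/srv/oer/index.json")  # assumed package index

  def prime_from_media(media_dir):
      """Install all *.zip content packages found on the removable media."""
      CONTENT_DIR.mkdir(parents=True, exist_ok=True)
      index = json.loads(INDEX_FILE.read_text()) if INDEX_FILE.exists() else {}
      for package in Path(media_dir).glob("*.zip"):
          shutil.copy2(package, CONTENT_DIR / package.name)
          index[package.name] = {"source": "offline media"}
      INDEX_FILE.write_text(json.dumps(index, indent=2))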

Also, the system is bi-directional: content produced by the Zambian school is uploaded to their school server, but automatically mirrored to the national Zambian server, and perhaps to the server near the African internet exchange point. When somebody from the PCF wants to get a learning resource from the Zambian school, they don't put any strain on the Zambian school network; the content just comes from a server near the African internet exchange point.

1.4.3 The proposal

So that outlines the scenario. Because this week is about proposals, here's the proposal:

  • Form a small consortium including key stakeholders (such as content providers, NRENs, and content users, ...)
  • Do some action research to see whether the above system would be usable and acceptable by schools and educational institutions in the developing world
  • Develop guidelines for content providers (how to make their resources automatically downloadable by "super miro")
  • Iteratively develop/deploy/test a "cascading download system", both on the server side, as well as for clients
  • The "cascading download system" may need to include on-the-fly content transcoding and transformation systems
  • Study the impact.

Of course the above proposal isn't completely taken out of thin air: Cliff's work with the eGranary, Theo's work with the Global Grid for Learning, as well as various ideas around the OCWC and OpenCast communities, already do at least some of this, but it would be good to try to tie some of this together globally.

1.5 Extensions

1.5.1 Mobile access to the major OER repositories via text-to-speech and/or telephony

This is not a new idea, but I think it is relevant and not yet implemented or championed.

The main idea is to take advantage of the deep penetration of mobile phones in the "developing" world and use them to access OER (or any online resource) starting with the most prominent repositories and platforms (such as Le Mill, WikiEducator, Wikiversity, Connexions, installations of EduCommons, Kewl/Chisimba, Sakai, OER Commons, etc.).

Scenarios:

SMS a "word" (or some other key phrase) to a number and the system reads back from the relevant OER repository using text to speech.

  1. Dial a number for a specific topic and:
    • Select language
      • Press 1 for Spanish, 2 for Portuguese, 3 for KiSwahili, etc.
    • Select topic
      • Press 1 for World Geography, 2 for ...
    • Select sub-topic ...
      • Press 1 for the introduction, 2 for learning activities, 3 for solutions, ...
      • ... etc. ...
  2. By SMS post a question about your mathematics/physics/whatever subject homework
    • and have hints (not answers) sent by SMS from one of a collection of mentors who see the questions on a web site and send out the hints.

These three scenarios could be implemented by extending the following projects:

  1. MobilED
  2. OpenPhone
  3. "Dr Math" - elaborated here

and I am sure there are others which could be integrated into the concept.
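
A minimal sketch of the first scenario (SMS keyword in, spoken summary out). The keyword table and the reply_with_speech callback are stand-ins; a real deployment would look up one of the repositories listed above and hand the text to a TTS/telephony layer such as the Asterisk modules mentioned under Technical Resources.

  # Hypothetical keyword -> summary table; in practice this would be a query
  # against an OER repository rather than a hard-coded dictionary.
  SUMMARIES = {
      "photosynthesis": ("Photosynthesis is the process by which plants use "
                         "sunlight to make sugars from carbon dioxide and water."),
  }

  def handle_incoming_sms(keyword, reply_with_speech):
      """reply_with_speech(text) stands in for the text-to-speech/telephony layer."""
      text = SUMMARIES.get(keyword.strip().lower(),
                           "Sorry, no material was found for that keyword.")
      reply_with_speech(text)

  # Example: 'speak' by printing.
  handle_incoming_sms("Photosynthesis", reply_with_speech=print)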

Technical Resources (mostly FLOSS):

  • For MobilED:
  • For OpenPhone:
    • Dialogpalette - web site (includes the Asterisk TTS modules developed for the OpenPhone/MobilED projects).
  • LWAZI - mobile access to public services in South Africa. Major project outputs likely to be usable in 2009.
  • UniWiki


1.6 Enabling communication

Would a platform benefit from integrated services, such as GGfL-like services (including Google Apps for Education), as well as video web conferencing technology such as VMukti?


1.7 Bjoern on collecting resources and tagging

If you look for instance at the MIT OCW pages, there is some metadata embedded in the page, which could easily also contain tagging, for instance with regard to curriculum. Likewise, there could be a visible tag on the page, that links to related resources.
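
As a rough illustration (the HTML snippet and the curriculum.tag name are made up; DC.subject follows the usual Dublin Core convention): metadata embedded in a page like this can be harvested with a few lines of standard-library Python and then matched against a curriculum tag vocabulary.

  from html.parser import HTMLParser

  class MetaTagExtractor(HTMLParser):
      """Collect name/content pairs from <meta> tags in a page."""
      def __init__(self):
          super().__init__()
          self.meta = {}

      def handle_starttag(self, tag, attrs):
          if tag == "meta":
              attrs = dict(attrs)
              if "name" in attrs and "content" in attrs:
                  self.meta[attrs["name"]] = attrs["content"]

  # Example with a made-up snippet; a real OCW page would be fetched and parsed the same way.
  parser = MetaTagExtractor()
  parser.feed('<meta name="DC.subject" content="Physics I: Classical Mechanics">'
              '<meta name="curriculum.tag" content="secondary-physics-mechanics">')
  print(parser.meta)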

Much of this depends on having a tagging system with relevant curriculum categories, which of course differ between different educational systems. This is a "known problem", and for instance the Global Grid for Learning is trying to address this through a curriculum mapping project.

There is also the question of how to make a page that has been identified as a "desirable solution" available offline - HTML5 manifests can help here.
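
A minimal sketch of that idea (the resource list and version string are invented): generate an HTML5 application cache manifest for a page and its assets, so a browser that has visited the page once can keep using it offline.

  # Hypothetical list of the page's assets.
  RESOURCES = [
      "index.html",
      "styles.css",
      "lecture01-notes.pdf",
  ]

  def build_cache_manifest(resources, version="2009-03-01"):
      """Return the text of an HTML5 application cache manifest for these resources."""
      lines = ["CACHE MANIFEST", f"# version {version}", ""]
      lines += ["CACHE:"] + list(resources)
      lines += ["", "NETWORK:", "*"]   # everything else still goes to the network
      return "\n".join(lines) + "\n"

  print(build_cache_manifest(RESOURCES))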

Regarding tagging pages: