Form, function and content in the VLE

The other day I blogged about the gap between the theory of providing material via an institutional VLE from the perspective of an educational developer and the reality of doing so as I experienced it as an academic. My feeling was that most of us (academics, that is, though I reiterate, by no means all academics) tend to see it as a content repository, and many students tend to regard it in the same light. Now, as it happens, there has been a recent and very interesting debate about the purpose of VLEs on the ALT Jiscmail list. One of the points made there was that the VLE tends to shape our way of thinking about technology, and I think there is something in that. Of course there are many other tools out there besides VLEs, and I was quite impressed with this attempt to incorporate some of them into Blackboard, posted by a contributor to that debate:

http://wishfulthinkinginmedicaleducation.blogspot.co.uk/2010/03/prezi-workaround.html

However, for better or worse, Lincoln and many other institutions are likely to continue with some form of VLE for the foreseeable future, and as I said in the last post, I actually deconstructed a VLE site (Blackboard in this case) which had accumulated about five years' worth of material. One of the first challenges in any kind of research (and I maintain that this is a form of research) is analysis. So, bearing in mind Bourdieu's warnings about the malleability of classes, and the way the field in which they operate tends to define them, here is a list of the classes of material I found. At first sight it reminded me a little of Borges's Celestial Emporium of Benevolent Knowledge, insofar as it has very little in common with recognised practice in the field of education.

  • PowerPoint slide sets used in lectures that are substitutes for lecture content
  • PowerPoint slides designed for use in class discussions
  • Word/PDF documents designed as handouts
  • Word/PDF documents which are drafts of articles
  • Word/PDF documents which contain downloaded articles
  • Web links to open access journal articles
  • Web links to journal articles on publishers' sites that have copyright clearance
  • Web links to journal articles on publishers' sites that do not have copyright clearance
  • Web links that are out of date
  • Web links that are broken
  • Web links that work
  • Blackboard wiki pages
  • Blog entries
  • Contributions to discussion groups
  • Audio recordings of lectures
  • Video clips
  • Video clips that no longer work
  • Administrative documents
  • Assignment submission instructions
  • Those that, at a distance, resemble flies.

(Oh all right, I just took that last one from the Celestial Emporium)

While it looks as though I have emphasised form and function over content here, that's partly to make the point that form and function tend to dominate technological discourse. I did also give each item up to three subject-based keywords, and the new site is in fact organised by topic, because I thought that would be of more interest to the students. But I thought the form listing was interesting too, because inherent in it are quite a lot of assumptions about what is helpful for student learning. Yes, there's a variety of forms, but is the same content available in each form? (No, of course not. Though it should be, if only to promote accessibility.)
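
For what it's worth, the bookkeeping behind that reorganisation is easy to sketch. The snippet below is purely illustrative (Python, with invented item names and keywords rather than the real inventory): each item records its form plus up to three subject-based keywords, and the topic-based site structure falls out of grouping by keyword rather than by form.

from collections import defaultdict

# Invented examples: each entry records an item's form (as in the list above)
# and up to three subject-based keywords.
items = [
    {"title": "Week 3 lecture slides", "form": "PowerPoint (lecture)", "keywords": ["research design"]},
    {"title": "Ethics approval form",  "form": "administrative",       "keywords": ["ethics"]},
    {"title": "Podcast: interviewing", "form": "audio recording",      "keywords": ["methods", "interviews"]},
    {"title": "Open access article",   "form": "web link",             "keywords": ["methods", "ethics"]},
]

# Group by topic rather than by form - this is essentially the new site map.
by_topic = defaultdict(list)
for item in items:
    for keyword in item["keywords"][:3]:   # enforce the three-keyword limit
        by_topic[keyword].append(item["title"])

for topic, titles in sorted(by_topic.items()):
    print(topic)
    for title in titles:
        print("  -", title)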

Form is important in technology. Not everyone can access Word 2010 documents, for example, and certainly not everyone has access to broadband sufficient to download video clips. What does the existence of broken, out-of-date and copyright-infringing material (which, let me reiterate, has all now been removed) tell us about our attitude to providing this material? This is one site in one department in one university, but I'll bet it's not atypical. What I would really like to do is a set of multiple case studies of sites in different institutions and different disciplines. The purpose of doing so, as with any case study, is not to generalise, but to learn from what other people are doing and improve practice. Yes, sometimes that will involve being (constructively) critical of practice, but case studies can just as easily uncover excellent and otherwise hidden practice. While the last couple of posts may sound as though I'm very critical of the site as it was previously conceived, I do think it made a lot of good and useful material available to students. (They just couldn't find it!)
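
As a practical footnote, the broken and out-of-date links didn't have to be found by clicking through them one by one. Here's a rough sketch of the kind of check that can be automated (Python; the URLs are placeholders, and in reality you would first have to extract the links from the VLE site or its export, so treat this as illustrative rather than a finished tool).

import requests

# Placeholder URLs standing in for the web links held on the VLE site.
links = [
    "http://example.org/open-access-article",
    "http://example.org/out-of-date-page",
]

for url in links:
    try:
        # HEAD keeps the check lightweight; fall back to GET if the server refuses it.
        response = requests.head(url, allow_redirects=True, timeout=10)
        if response.status_code >= 400:
            response = requests.get(url, allow_redirects=True, timeout=10)
        status = "OK" if response.status_code < 400 else "broken (%d)" % response.status_code
    except requests.RequestException as error:
        status = "broken (%s)" % error.__class__.__name__
    print(status, url)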

As I say, I think it would be really useful to do some research into this on a wider basis, but there's an obvious methodological challenge. Since I don't have access to sites anywhere other than Lincoln, if I ask for participants there's an obvious risk of only being given access to sites that participants want me to see. On the other hand, extreme cases can be very informative in qualitative research. That's a discussion for the research proposal though. On the basis of this case, I think there is an argument to be made that it is too easy for function to follow form, and for both of them to overshadow content, in VLEs and perhaps in e-learning generally.

What are VLEs for?

It occurred to me the other day that I have been working with VLEs (Virtual Learning Environments) in one form or another for getting on for two decades now, and during those twenty years endless articles and books have been churned out on e-learning. I had been going to write something about how technology has transformed educational practice, but actually I don't think it has, or not, so far, by very much. My role has, historically, been one of "supporting" academic colleagues with this technology, but it was only recently, when I became a programme leader (with responsibility for my very own module), that I began to think about what kind of support would be useful to me. I'd be the first to admit that I am probably something of a special case. I know our VLE inside out and am very comfortable with the technology. I realise that not everyone shares that knowledge or comfort, so this is inevitably something of a personal take.

Nevertheless, it didn’t take many interactions with actual students to make me realise that the approach to e-learning we had been taking on the doctoral programme I studied, taught on, and am now leading, wasn’t really meeting their needs. (Come to think of it, as a student I hardly ever used the VLE myself). Let me say now, that this is not going to be a normative piece laying down the law about how VLE sites should be structured. I’m sure Lincoln’s doctoral students have their own unique set of needs, and these will be very different from say the needs of undergraduate students in other disciplines and at other universities. That said, to go back to the issue of support I started out with the idea that people needed to get a hold on how the technology works. I suppose they do, and in fairness, that was often the focus of requests for support. (Still is!) And that is what we, as educational developers have, by and large, provided, relying on the creativity of colleagues to do something clever with it. I suppose where we have fallen short is that we haven’t really built on that foundation. Having swapped my educational developer hat for an academic hat, I can see why. It’s really challenging to completely redesign a VLE site to match what the students say their needs are. At a programme board last year I reacted to student criticisms of what was provided for them on the VLE by blithely announcing that I would completely redesign it, thinking it would take a few weeks at most. It took six months, and detracted from quite a lot of things I was supposed to be doing, like, er research. Even now, even though the redesign has been launched, and seems to have been well received I’m acutely conscious that I’ve hardly begun to scratch the surface as far as things like learning activities for the students are concerned. Most of the work I have done so far is simply about providing a structure for the various teaching materials that I and other colleagues have provided, along with a little bit of cosmetic work on the menu and home page.

While I said I haven’t been doing research, I do think this exercise has given me the foundations of a theoretical framework for thinking about the contribution VLEs can make to a course. Clearly, if a VLE is to meet the needs of students, there has to be quite significant engagement with both the students and with the colleagues who are teaching on the programme. That’s not particularly original. Sharpe & Oliver (2007) make much the same point. Secondly, I think there is a need to think about what sort of contribution the VLE can make to students’ learning. Clearly, the best VLE in the world is no substitute for the University library. Yet, in the exercise I have just completed I counted around 400 “learning items” which had been generated over the last five years. These included PowerPoint slides, Prezis, and handouts from teaching sessions and guest lectures, podcasts, videos, and quite a few journal articles that (ahem) didn’t appear to have appropriate copyright clearance. (Those have all been removed now.) On top of those there was a whole range of what might be called regulatory documents such as programme handbooks, ethical approval forms and assignment submission sheets. Clearly that’s a significant and useful resource, but on its own it’s not anything like adequate for doctoral, or even, some would argue undergraduate, study. Even having imposed some sort of structure on all this material, which is really all I have done in the redesign, I’m still not sure where to go next. What learning activities are appropriate? Why? How do I design them? Do I limit myself what the technology offers? (A fairly obvious danger in simply “training” colleagues to use the technology)

So this raises the question: what exactly is a VLE for? Maybe that's better phrased as "what is it not for?" Students, at least in surveys at Lincoln, have often said that they want "consistency" in the way staff use the VLE. Well, yes, but I think there has to be a general agreement about what we can reasonably expect of a VLE. There is clearly a tension between this desire to meet students' legitimate expectations and the kind of academic freedom that these technologies allow. It doesn't seem reasonable to me to expect e-learning to take the same form in, for example, modern dance as you would find in chemical engineering. Equally, it could be argued that providing students with material through the VLE detracts from the important skill of literature searching, whether that's done in a library or through a Google search. Even more importantly, providing them with "all the resources they need", even if it were possible, is unlikely to encourage them to develop a critical engagement with the literature.

Where does that leave us then? After nearly 20 years of using VLEs, have we just ended up with an expensive, badly organised repository of content of dubious value? In some cases, undoubtedly, though it would be quite wrong to think that all VLE sites fall into that category. There is some excellent work out there. I've been to plenty of conferences where I've seen good, innovative and creative practice, and I know from my support role that many colleagues at Lincoln are pushing the boundaries in quite imaginative ways. The challenge is to spread this kind of practice, bearing in mind that such innovation is risky, even if the major risk is that academic staff devote more time to their students than to their research. (After all, you might not get that grant bid in, or that journal article submitted, and since the teaching grant disappeared in the humanities and social sciences, that is by no means a small risk.) I do think though that there is a case for more detailed research into what academics actually do in terms of course design with a VLE. But that's for another post.

Reference
Sharpe, R. & Oliver, M. (2007) 'Designing courses for e-learning', in Beetham, H. & Sharpe, R. (eds) Rethinking Pedagogy for a Digital Age: Designing and Delivering E-learning. London: Routledge, pp. 41–51.

Making a video with PowerPoint

This is by way of a bit of self-development. The university library has introduced a new piece of software called Talis Aspire, designed to make reading lists a little easier, and I was wondering how best to introduce it to colleagues who are less than enthusiastic users of technology. I've also been thinking for a while that it ought to be possible to make reasonable-quality videos using simple tools – in this case PowerPoint 2010.

This is very much a first attempt. I realise the text is very small, and there's no sound at this stage because I wanted to keep the file size low, and anyway, I didn't have a lot of time. Depending on how well received this is, I may well develop a more accessible version later on. (Any volunteer voice actors out there with a few minutes' spare time? I envisage a male/female conversation, but it's not essential.) Anyway, here's the "proof of concept".

Electronic Submission of Assignments part 2

Manchester Postcard by Postcard Farm (Flickr: http://www.flickr.com/photos/postcard-farm/5102395781/)

As promised here’s part 2 of my report on the E-submission event held at Manchester Metropolitan University last Friday.

The presentations from the event are available here: http://lncn.eu/cpx5

First up was Neil Ringan, from the host university, talking about their JISC-funded TRAFFIC project. (More details can be found at http://lrt.mmu.ac.uk/traffic/ ) This project isn't specifically about e-submission; it is more concerned with enhancing the quality of assessment and feedback generally across the institution. To this end they have developed a generic, end-to-end, eight-stage assignment lifecycle. It starts with the specification of an assessment, which is relatively unproblematic, since there is a centralised quality system describing learning outcomes, module descriptions and appropriate deadlines. From that point on, though, practice is by no means consistent: at stages 2–5 (setting assignments, supporting students in doing them, methods of submission, and marking and production of feedback) a wide range of different practices can be seen. Only at stage 6, the actual recording of grades, which is done in a centralised student record system, does consistency return. We then return to a fairly chaotic range of practices at stage 7, the way grades and feedback are returned to students. The TRAFFIC project team describe stage 8 as "ongoing student reflection on feedback and grades". In the context of a debate about whether to adopt e-submission, I'm not sure that this last stage really is part of the assessment process from the institution's perspective. Obviously, it is from the students' perspective. I can't speak for other institutions, but this cycle doesn't sound a million miles away from the situation at Lincoln.

For me, there’s a 9th stage too, which doesn’t seem to be present in Manchester’s model, which is what you might call the “quality box” stage. (Perhaps it’s not present because it doesn’t fit in the idea of an “assessment cycle”!) I suppose it is easy enough to leave everything in the VLE’s database, but selections for external moderation and quality evaluation will have to be made at some point. External examiners are unlikely to regard being asked to make the selections themselves with equanimity, although I suppose it is possible some might want to see everything that the students had written. Also of course how accessible are records in a VLE 5 years after a student has left? How easy is it ten years after they have left? At what point are universities free to delete a student’s work from their record? I did raise this in the questions, but nobody really seemed to have an answer.

Anyway, I’m drifting away from what was actually said. Neil made a fairly obvious point (which hadn’t occurred to me, up to that point) that the form of feedback you want to give determines the form of submission. It follows from that that maybe e-submission is inappropriate in some circumstances, such as the practice of “crits” used in architecture schools. At the very least you have to make allowances for different, but entirely valid practices. This gets us back to the administrators, managers and students versus academics debate I referred to in the last post. There is little doubt that providing eFeedback does much to promote transparency to students and highlights different academic practices across an institution. You can see how that might cause tensions between students who are getting e-feedback and those who are not and thus have both negative and positive influences on an institutions National Student Survey results.

Neil also noted that the importance of business intelligence about assessments is often underestimated. We often record marks and performance, but do we evaluate when assessments are set? How long students are given to complete them? When deadlines occur? (After all, if they cluster around Easter and Christmas, aren't we making a rod for our own backs?) If we did evaluate this sort of thing, we might have a much better picture of the whole range of assessment practices.

Anyway, next up was Matt Newcombe, from the University of Exeter, to tell us about a Moodle plugin they were developing for e-assessment. More detail is available at http://as.exeter.ac.uk/support/educationenhancementprojects/current_projects/ocme/

Matt’s main point was that staff at Exeter were strongly wedded to paper-based marking arguing that it offered them more flexibility. So the system needed to be attractive to a lot of people. To be honest, I wasn’t sure that the tool offered much more than the Blackboard Gradebook already offers, but as I have little experience of Moodle, I’m not really in a position to know what the basic offering in Moodle is like.

Some of the features Matt mentioned were offline marking and support for second moderators, which, while a little basic, are already there in Blackboard. One feature that did sound helpful was that personal tutors could access the tool and pull up all of a student's past work and the feedback and grades that they had received for it. Again, that's something you could, theoretically anyway, do in Blackboard if the personal tutors were enrolled on their sites. (Note to self – should we consider enrolling personal tutors on all their tutees' Blackboard sites?)

Exeter had also built into their tool a way of providing generic feedback, although I have my doubts about the value of what could be rather impersonal feedback. I stress this is a personal view, but I don't think sticking what is effectively the electronic equivalent of a rubber stamp on a student's work is terribly constructive or helpful to the student, although I can see that it might save time. I've never used the Turnitin rubrics, for example, for that reason. Matt did note that they had used the Turnitin API to simplify e-marking, although he admitted it had taken a lot of effort to get it working.

Oh dear. That all sounds a bit negative about Exeter's project. I don't mean to be critical at all; it's just that it is a little outside my experience. There were some very useful and interesting insights in the presentation. I particularly liked the notion of filming the marking process, which they did in order to evaluate it. (I wonder how academics reacted to that!)

All in all a very worthwhile day, even if it did mean braving the Mancunian rain (yes, I did get wet!). A few other points were made that I thought worth recording, though I haven't worked them into the posts yet.

• What do academics do with the assignment feedback they give to their current cohort? Do they pass information on to the colleagues who teach that cohort next? Does anybody ever ask academics what they do with the feedback they write? We're always asking students!
• "e-submission the most complex project you can embark on" (Gulps nervously)
• It's quite likely that the HEA SIG (Special Interest Group) is going to be reinvigorated soon. We should join it if it is.
• If there is any consistent message from today so far, it is “Students absolutely love e-assessment”

Finally, as always, I welcome comments (if anyone reads this!), and while I don't normally put personal information on my blog, I have to go into hospital for a couple of days next week, so please don't worry if your comments don't appear immediately. I'll get round to moderating them as soon as I can.

Blackboard 2012+

Here’s a little more from Wednesday’s Midlands Blackboard User Group meeting. As always, Blackboard representatives were present at the meeting to tell us all about their plans for the future, and this proved an interesting session. As a company they have always had a strategy of absorbing, or partnering with other tools, but I was quite surprised to hear that they are talking about integrating with Pebble Pad, which I’ve always felt was the best of the various e-portfolio tools available. My surprise arises from the fact that Pebble pad is a product that has always had a completely different look and feel from Blackboard, although I haven’t seen the latest versions as yet. Colleagues from other institutions tell me they’re very impressive though.  They also mentioned a potential partnership with a company called Kaltura, who offer a video management tool which has apparently enjoyed some success in HE.

 

The new look interface

On to more mundane matters: we were told about the upcoming service packs. Service packs are not new versions of the software, but are either enhancements to the existing functionality, or fixes for the enhancements that didn't quite work in the previous version. The latest such pack is, according to Blackboard, the first one to be wholly based on customer feedback, and offers a number of significant improvements to the user interface. Most importantly, in this pack they have finally got rid of the module boxes on the front page, replacing them with a menu that appears when you roll your cursor over the word "courses". There's a "recently visited" list of modules too. Instructors can change colour schemes within modules, although there is a limited range of schemes. Finally, they have introduced what they call course relationships – this simply means that it is possible to arrange menus hierarchically, so, for example, you could show modules as part of a programme.

 

In terms of assessment, they have introduced negative and weighted marking into the online quizzes. Although there's some debate about this, my own feeling is that the ability to penalise students for guessing makes a multiple choice test a much more powerful assessment tool. There's also a "regrade the whole test" feature. Previously, if an instructor made an error and indicated an incorrect answer as correct, they had to regrade each student's attempt. Now this can be done in a single operation.

 

The final enhancement in the latest service pack is what is called "standards alignment". It is possible to align courses, and even specific learning outcomes, with sets of standards, and you can also find content that is associated with particular standards, a feature that looks as though it may be particularly useful in course validations, as we'll be able to test how any proposed module meets the strategic requirements of the university. (In theory. Of course, it would be necessary to convert those requirements into standards first.)

 

For techies, out-of-the-box Shibboleth and CAS authentication will be provided, and it is now possible to use whatever version of Apache you want. Finally, "Blackboard Analytics" will be released in June, offering very detailed statistical reporting, although this is a new product, not part of a service pack.

 

This last paragraph looks at medium-term plans, although the Blackboard representative did admit that most of this stuff was, as yet, only half written. However, it does sound quite exciting. Plans for later service packs (2013) include a much greater social presence for Blackboard, including a student learning space where they can interact with friends (I detect the influence of Facebook here). Something else that is long overdue is a shareable learning object repository, which will be added to the content store. They'll also be upgrading the test canvas again, to make some of the question tools a little bit more user-friendly. The survey tool will also be enhanced so that it will be possible to deploy surveys outside Blackboard, which may be of some interest to researchers. Analysis tools will be provided within Blackboard. Finally, and again something that should have been done years ago, it sounds as though it will be possible to integrate the Blackboard Calendar with web-based calendars (although they didn't mention Outlook).

Blackboard Midlands User Group Meeting, University of Northampton.

University of Northampton

Yesterday, I attended the Midlands Blackboard User Group meeting at the University of Northampton, and I thought I'd regale you with an account of the proceedings. The User Group meetings are always valuable, but this one proved to be particularly helpful, as many of the institutions represented had reviewed, or, like us, were in the process of reviewing, their virtual learning environment, so there was a lot for us to learn from. Now, I'll admit from the start that a Blackboard user group is likely to have something of a pro-Blackboard bias, but the key phrase here is "user group". There is nothing that users like more than kicking a little corporate ass, and I've been to lots of meetings at both regional and national level where that sport has been gleefully indulged in.

I’m not going to name names on a public blog, but will be happy to share the details of evidence provided at the meeting with our review team. (Also, some of this is taken from private conversations outside the meeting, so it would be doubly inappropriate to share identities). Finally, before I get going, this is a report, not a personal position on my part, nor should it be taken as being indicative of any future decisions that the University of Lincoln may take.

One institution had undertaken a very thorough comparison of Blackboard and Moodle, after shortlisting them from a larger number of VLEs. They started from the position, as the delegate entertainingly put it, of the PVC in charge sticking their fingers in their ears and shouting "La La MOODLE La La" (or words to that effect) whenever anyone mentioned any other VLE. (I suspect there may be a touch of dramatic licence there, but it does seem to be a syndrome that is widely reported at these events.) Anyway, the review team, having completed their comparison, announced that they were less than impressed with Moodle.

One concern was that they felt it would be very difficult to upgrade. If, for example, you want to move from version 1.9 to the latest version (2.3), you not only have to download and install the new version, you have to rewrite all the patches you made to make 1.9 match your business processes (and pay staff to do it). Of course you can pay a consultant to do that, but smaller software houses are sometimes quite literally just that, consisting more or less of one person working from their own home, so there's little reliability. Finally, you could wait for the open source community to deliver the patch, but that may be a risky strategy with a mission-critical system.

Another major concern from this institution was that, in spite of claims to the contrary, it proved almost impossible to transfer Blackboard content seamlessly into Moodle. (They set up two servers to compare the latest versions of Blackboard and Moodle.) They also discovered that, of all the VLEs they looked at, the only one that produced IMS-compliant packages was Blackboard. I was quite surprised at that, but to be absolutely frank, I haven't thrown myself into the exciting world of metadata standards as deeply as I probably should have done, so didn't ask for more details.

Not specifically related to the qualities of Blackboard, Moodle or any other VLE, but still something of an issue, was the reluctance of academics to get involved in the review. There was definitely a sense of "Why should we engage with this, because we'll only have to do it again when we move to Moodle". In the end this institution decided to renew their contract with Blackboard. As a matter of interest they also costed the review, which they said came out at about 300 hours of academic staff time, plus the cost of running all the focus groups. Reviewing a VLE, it seems, is not a cheap exercise. They also concluded that any cost savings that might be made by moving would be more than cancelled out by the support and maintenance costs.

"Moodle is free like a puppy is free"

 

The delegate summed this up with a memorable quote from a VLE manager who uses Moodle herself. “Moodle is free like a puppy is free”. You might get it for nothing, but you’ve got to feed it, house train it, exercise it and pay the vet’s bills when it gets ill.  And it won’t even fetch a stick!

(Image credit – http://www.flickr.com/photos/shekum/3415361722/)


 

 

Most other institutions gave less vivid accounts, but at least one major Midlands university had also decided to stick with Blackboard, though they had decided to shift from hosting it themselves to letting Blackboard host it. Their principal reason for doing so was "terrible service from our central IT services". This delegate related tales of servers failing and not being rebooted (resulting in no Blackboard service), and noted that "Blackboard are quicker to tell me there's a problem than our own IT services". Central IT support services in general seemed to attract some opprobrium from the delegates, so perhaps we should be grateful that Lincoln's IT team are on the whole very responsive.

Many of the other reports focussed on attempts to integrate Blackboard with student record and programme data systems. This is something the Blackboard representatives said they were keen to develop, and I'll report on the latest version of the Blackboard Road Map in a future post. There was quite a lot more at the meeting, not all of it VLE-related, so I'll need at least a couple more posts to tell the full story.

 

Threshold Standards for the VLE?

Durham Conference blog post, part 3

As promised, here’s the last blog post from Durham, but in some ways the most controversial. There was a panel debate at the end of the first day on whether institutions should impose a minimum standard on VLE provision. To put it in Lincoln terms do students have a right to expect that certain things should be provided on Blackboard?  This is an issue that raises its head from time to time, and on the face of it one might think it was uncontroversial. (Students are paying fees, are they not? Why shouldn’t they expect some material to be provided through what is, effectively, the major student learning portal.).

For me, though, these things aren't quite so simple. I do accept that students have a right to a basic institutional information set, but there's a debate to be had about what it should contain. I'm a lot less comfortable with the notion that every module, across all disciplines and at both undergraduate and postgraduate level, should be denied the freedom to use the technology in whatever way those teaching the module think most appropriate. My second objection to a minimum standard of teaching information is that it is very likely to be highly didactic, effectively saying "This is what you must do to pass this module." Lincoln's strategy is to cast the student as a producer of their own learning. While that clearly involves providing students with spaces to learn in, and access to resources, whether they be text-based, digital or specialised equipment, it also involves providing the opportunity to make, show and, perhaps most importantly of all, discuss their work. I'm not sure that VLEs are really set up for that, as I said in a post a few weeks ago. Not yet, anyway.

Anyway, that’s enough about my views – how did the debate go.?  Well, right at the beginning, we had a vote on whether we should have a minimum standard, and the results were

First vote (results at the beginning of the session):

YES – 56%

NO – 17%

DON’T KNOW – 23%

DON'T UNDERSTAND THE QUESTION – 5%

(Actually, the preferred term is threshold standard rather than minimum standard, the idea being that users will progress beyond the threshold, rather than work to the minimum.)

In some respects this debate is a reflection of the success of the VLE. Many of the early adopters were seen as being rather adventurous, pushing the boundaries of what could be done with technology. Nowadays, though, VLEs and learning technology are commonplace, and while I don't want to over-generalise, students are generally much more familiar with learning technologies, which implies that there would be a demand for technology-based learning even if fees had not been introduced. The environment they grew up in and are familiar with happens to be technology-rich. Certainly, as one of the panellists suggested, it's a good idea to try and look at the VLE through students' eyes. I haven't conducted any sort of survey into this, but I strongly suspect that most educational developers prefer to see themselves as having a quality enhancement role, rather than a quality assurance role. Enhancement, to be effective, must involve the views of the users, which takes us back to the Student as Producer strategy.

Some contributors suggested that the commonest complaints from students were not so much about content, but about inconsistencies in design and structure. That, as one panellist pointed out, was a real problem for joint honours students. The general feeling of the meeting was that this is best solved by involving students in the design, but at a course or departmental level rather than an institutional level, which would go some way to alleviating my objection that courses in, say, Fine Art are profoundly different from courses in Computer Science, and that trying to impose a universal standard on both would be counterproductive. (Although that still wouldn't really help joint honours students.) It was suggested that departments could produce mutually acceptable templates for their Blackboard sites, which is a start, but still runs the risk of empty content areas. I'm not sure that's a major issue. While we don't mandate what staff do with their Blackboard sites at Lincoln, we do have a standard template for new sites, which staff are free to change. My feeling is that, while I have some reservations about the didactic nature of the template, it does work quite well, although I do think there's scope for a small piece of internal research assessing how often colleagues depart from the template, or, if they don't, which buttons are most used.

One audience member asked about standards in other technologies. I'm not sure that, other than computer use regulations, which are really about ensuring that an institution complies with legal requirements, they are that common. We don't really mandate what colleagues can say in e-mail, or even what format e-mails should be sent in. Even if we did, we couldn't enforce it, which is of course an issue for VLE provision too. The only real sanction is that poorly designed content posted on a VLE is likely to stay around much longer than a poorly delivered lecture, and be visible to colleagues, which ought to be an incentive for colleagues to concentrate on ensuring that such material is of the best possible quality.

A final objection to a threshold standard is that it requires a certain standard of competence from the users of the technology. University lecturers are primarily employed for their disciplinary expertise, and to a lesser extent for their pedagogical skill. Technological skill comes (at best) third, although you might argue that, in the current highly technological environment, digital literacy is as essential as, well, literacy. My own view is that most people's digital literacy is pretty much adequate, although there is a minority who will always prefer to get someone else (usually an admin assistant) to post material on the VLE. That, I think, is where minimum and threshold standards have the potential to cause recruitment problems. As an institution we'd have to decide what the essential skills for working with technology were, and ensure that we find people who have sufficient disciplinary, pedagogical and technological skills.

Interestingly, when the vote was run again at the end of the session, the results were:

 

Second vote (results at the end of the session):

YES – 43%

NO – 43%

DON’T KNOW – 14%

DON’T UNDERSTAND THE QUESTION – 0%

 

Which, if nothing else, indicates that debating a topic improves understanding. At the end, everybody understood the question. More seriously, the debate was an excellent illustration of the problems associated with imposing standards on a highly diverse community. They're a good idea until you have to conform to them yourself.

 

One last thing – there's a much better summary of the debate available, provided by Matt Cornock, to whom many thanks.

All that remains for me to do is to thank the Learning Technologies team at Durham for organising an excellent conference (which they always do!) and to recommend the conference to colleagues for next year. It's always a good mix of academics and educational developers, and you get to see some really interesting practice from around the sector. I've been for the last four years now, and while I'm more than happy to keep my attendance record up, I'm beginning to feel a bit selfish about hogging it.

Trojan Horses and Openness.

Durham Blackboard Users' Conference

 

Durham Cathedral and Castle

In what has become something of a new year ritual for me, I took myself off to the wonderful little city of Durham (a bit like Lincoln, but with a much more impressive river to set off the cathedral and castle!). The conference itself is, effectively, the annual meeting of UK Blackboard users and is a mix of keynotes, practitioner presentations, debates and presentations from the Blackboard executive about their plans for the future. And delegates get to eat the conference dinner in the Great Hall of the castle.

This is the first of two posts. In this one I'll try and summarise Blackboard's plans for the future, and the two (excellent) keynote presentations by Grainne Conole and Ray Land. In the next, I'll write up the practitioner presentations, and what proved to be a very interesting debate about whether institutions should try and specify a minimum threshold for content on Virtual Learning Environments.

 

Blackboard’s plans for the future.

 

There will be a new look and feel to Blackboard released in February, although this won't affect any of the existing functionality. From the demonstration, it did appear to have a much more modern aspect to it. Additionally, users will have much more ability to customise Blackboard to their own taste (or lack thereof!). Later in the year, although they were rather vague about exactly when this will happen, they plan to release an upgrade to the online submission process, which will allow instructors to mark student work using Microsoft Word's track changes feature. It will be possible to save the file, with the comments made, and then release that to the students as feedback. Lincoln users probably won't see these changes until September at the earliest, since any upgrade needs to be thoroughly tested behind the scenes to ensure that it doesn't break any current services. There was also some talk of a new analytics product being released soon, which would give us much better information on how Blackboard is being used, although either the details were a bit sketchy or, more likely, my notes are a bit sketchy.

 

The Keynotes 1: Using the VLE as a Trojan Horse: Grainne Conole

http://portal.sliderocket.com/BIVJM/conole_durham

 

First up was Grainne Conole, with a presentation on "Using the VLE as a Trojan Horse". The argument is, essentially, that the VLE can serve as a "nursery slope" on which academic staff can familiarise themselves with technology. That might sound a little patronising, but it is a fact that not everybody is at the same level of technology skills. The VLE also offers benefits such as centralised support and administration, and the development of consistent practice around teaching practices, for example online submission of assessments (and the feedback thereon).

 

The idea of the "nursery slope" arises because there are lots of new technologies with apparently unlimited potential for new approaches to learning and teaching, but the reality is that the opportunities offered by social technologies, particularly those around peer critiquing, networking, openness, personalisation and user-generated content, are not fully exploited, and uses of them often simply replicate bad pedagogy. Of course that is a charge that might be levelled at the VLE itself.

 

It is true that many colleagues use a VLE as little more than a document repository, but that is hardly the VLE's fault, and it seems to me that switching from one VLE to another is likely to delay the development of both digital and pedagogical literacy as colleagues familiarise themselves with the basics of a new nursery slope (too-frequent upgrades probably don't help here either). But if you can keep your head (or at least your VLE), people become more aware of the functionalities of the system, and more inclined to push at the boundaries.

 

Grainne then moved on to discuss some of the technologies that might be incorporated into the VLE: content from mobile devices such as smartphones; study calendars to pace learning; rich multimedia content such as TED talks, self-created podcasts and vidcasts; e-assessment exercises such as annotation tools or online quizzes; social bookmarking; and to-do lists.

 

The Keynotes 2: The implications, meanings, and risks of openness in the digital academy: Ray Land.

 

Ray started with an interesting metaphor of the cloister. I'd never thought about the etymology of this before, but the word clearly shares a root with "closed". Academia is traditionally "cloistered": knowledge is enclosed in print, which is bound in books, which are stored in libraries. (In many libraries, the books were originally chained to the shelves.) Not much openness there, and totally antithetical to the world we now live in, which is characterised not so much by openness as by speed.

 

Digital knowledge, however, is constantly changing, shifting, being added to, and is essentially ungraspable. Even the concepts we use to talk about it are out of date. Have a look at this graphic showing what happens every 60 seconds on the Internet.

What happens every sixty seconds on the Internet

 

I suppose it's a little disingenuous to present a global picture when most of us operate in a much smaller environment (Planet Earth is quite a big place, after all), but the case that there is no longer a stable body of knowledge in any discipline that can be mastered seems unanswerable. He then quoted the work of Virilio (1988), who argued that all technologies will ultimately fail, and of course the more connected they are, and the faster the connections, the bigger the effect. In other words, as Ray put it, the "21st Century Catastrophe, when it occurs, will affect everyone." There will be no escape!

 

 

Putting the impending apocalypse to one side, we then turned to the question of how speed is affecting teaching. It seems unarguable that the speed and quantity of information are antithetical to the long-established pedagogical techniques of discussion, thought and reflection. You can't really have a debate on Twitter over a few weeks, for example. Ray drew an interesting comparison with the slow food movement, which is primarily about people getting together to talk over meals, rather than gobbling sandwiches on the hoof. It seems to me, though, that the problem with "slow pedagogy" is that it's very hard to step back from the complexity, or supercomplexity as Ron Barnett would call it, of the social media world. As university teachers we can't really ignore it, and have to find ways of preparing our students for it.

 

Perhaps the biggest problem for academia is open text. There's a lot of interest in Open Educational Resources, but the price of openness is the weakening of authority. That may be no bad thing, but it is something of a threat to the traditional university. Ray gave examples of how degree programmes are changing – e.g. Coventry University now offers an 18-month degree "lite", making extensive use of OERs (although evidently not those concerned with spelling!), and there are increasing numbers of web-based learning organisations. That doesn't mean the end of the university though. An interesting statistic is that 5% of the world's population have had the benefit of a university education; 95% have not. Ray asked whether the cure for cancer, or a perpetually sustainable energy source, was more likely to come from the 5% or from the 95%, if the 95% were given the opportunity. Apparently China is currently opening one new university a week. (Yes, you read that right – a new university each week, not a new building.)

 

Ray concluded by arguing that there is little doubt that being open carries risks. How do you ensure quality? Might individual teachers' knowledge be marginalised by the changing "general intellect"? How do you ensure that knowledge is not misappropriated and commodified by powerful technology corporations? As Ray's other work has shown, academic knowledge is often "troublesome". I've always felt that politics and commerce under capitalism are about pretending that there are simple solutions, and selling them on. In the end we need to think about how we can develop a new ethics of knowledge sharing and openness that acknowledges doubt and uncertainty, and most of all we need to continue to research the costs and benefits of open knowledge.

VLE data

First, I'd better come clean. This data isn't mine – it's from a publicly available spreadsheet produced by Matt Lingard, and many thanks to him for making the full set available.

Following on from my last post, I thought it might be useful to get a sense of who was using which Virtual Learning Environment (VLE), so I had a little play with Matt's data, concentrating on UK Higher Education institutions only. (The full dataset does include some overseas institutions.) It seems that Blackboard is still the most widely used VLE in UK Higher Education.

VLE statistics: UK Higher Education
Apart from Blackboard and Moodle, no VLE is used by more than one institution. (The "other VLE users" in this graph use either something they developed in-house, or other commercial products.) Given the constant complaints about the high cost of Blackboard, something else I thought I might glean from Matt's data was how many institutions are changing from one VLE to another (or planning to change – the spreadsheet simply comments on some individual cases, so it's hard to see how far a change is confirmed). In effect, that means changing from Blackboard to Moodle, since as far as I can tell from the data no one is planning to change to Blackboard from either Moodle or one of the stand-alone VLEs. Here are the figures.
UK institutions planning to change Virtual Learning Environment
I was quite surprised that relatively few institutions were planning a change, most preferring to upgrade their existing VLE. There are significantly new versions of both Blackboard and Moodle this year, so this would appear to have been a good opportunity to change, but most seem to have contented themselves with upgrading what they already have. (Also, the phrase "doing nothing" in the chart legend is a bit misleading, since many of the Blackboard users upgraded last year.) Of course, it may be that institutions are contractually tied to Blackboard, which is preventing them from changing until the contract expires. It would be interesting to repeat this exercise over the next couple of years and see if there are any changes to this pattern.
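
For anyone who wants to repeat the exercise when the spreadsheet is next updated, the tallying itself is straightforward once the data is saved as a CSV. The sketch below is mine, not Matt's, and the column names ("Country", "VLE") are assumptions about the layout rather than his actual headers.

import csv
from collections import Counter

vle_counts = Counter()
with open("vle_survey.csv", newline="", encoding="utf-8") as handle:
    for row in csv.DictReader(handle):
        # Column names are assumed for illustration, not the spreadsheet's real headers.
        if row.get("Country", "").strip() == "UK":
            vle_counts[row.get("VLE", "Unknown").strip()] += 1

for vle, count in vle_counts.most_common():
    print(vle, count)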

Shareville

Regular readers (yes, both of you) will know that I've been a little bit sceptical about the concept of virtual worlds in education in previous posts. That's probably because World of Warcraft, Second Life and so forth weren't really designed for educational purposes, so we've sort of adapted them. That's not to say there hasn't been some good stuff done in SL. I like Teesside's Bayeux Tapestry sim in Second Life, for example. But I was also impressed by Shareville, a virtual town developed by Birmingham City University.

Shareville is a "virtual town" designed to help students prepare for learning in the workplace. You can navigate round the town using a grid-based "map". Clicking on a square takes you to a still 360-degree photograph of a district of the town, and by moving your mouse around the photo you are taken into interesting scenarios. It's perhaps pushing it a bit to compare it with things like Second Life, because you don't have an avatar and it's not a fantasy world – in fact it's a rather grim view of reality! Technically I suppose it's just a database. But it is expandable, so different scenarios can be added for different disciplines. I also liked the way that Shareville was designed to be used in conjunction with other systems – no attempt is made to duplicate resources that might be in Moodle, Wimba or Mahara. Tutors put instructions on how to use Shareville in the VLE, and students access them from there.
Anyway, rather than me going on about it, watch this presentation from the designers. There are also links for visitors to go and have a play with it.
While we're on the subject of virtual worlds, I couldn't resist this. I know it's really just a game, but isn't Lego about building a virtual world in the first place? So it's a virtual world within a virtual world. A conundrum for the philosophy dept.