Using a conceptual framework to manage your data

Information overload?

One of the problems with any research endeavour is that you collect a lot of data. I don't just mean the primary data you get from your interviews and so forth (though, if you're doing it properly, there'll be a lot of that); rather, I am referring to the ideas that you generate as you read the literature.

I think students struggle with this. I know I do.

If you just make notes at random, you will eventually have to organise them, and to do that you need an organising principle. All the textbooks suggest that you should have a "conceptual framework" in advance and try and relate your reading to that. "Conceptual framework" is one of those phrases that researchers use, a bit like "ontology, epistemology and axiology", to frighten new research students.

I’ll try and explain. I’m currently interested in the way information is managed inside Virtual Learning Environments. The reason for my interest is that students are often heard complaining that academic staff use Blackboard, or Moodle, or whatever it might be, “inconsistently”. So the concept of “inconsistency” is one element of my conceptual framework. When I come across something I’m reading that talks about this, I can make a note of the author’s argument and whether I agree with it or not, and why. I might even help myself to a particularly pithy quotation (keeping a record of where I got it from, of course).

That’s simple enough, except that one concept does not make a framework. The point is that you have to have multiple concepts, and they have to be related to each other. First, in creating my framework I should probably define (to my own satisfaction) what I mean by “inconsistency”. It might be a rather hit-and-miss approach to the types of learning material provided (e.g. on one topic there’s a PowerPoint, on another there are two digitised journal articles, on another a PowerPoint and a half-finished Xerte object). It might be that one member of the teaching team organises their material in a complex nest of folders, while another just presents a single item which goes on for pages and pages. Or it might be that one of a student’s modules is organised into folders labelled by week (when did we study Monmouth’s Rebellion – was it week 19, or week 22?) and another by the format in which it was taught (now, where did she present those calculations – was it in the “lecture”, or the “seminar”?). So for the purposes of organising a conceptual framework it’s not so much defining inconsistency as labelling types of inconsistency. You might say they’re dimensions of inconsistency.

Also, as researchers, we try to explain things, so it’s likely that much of the literature will offer explanations. That’s another part of our framework, then – explanations, or perhaps we’ll label it “responsibility”. This inconsistency might be the teacher’s fault, for being technologically illiterate, not understanding the importance of information structures, or just being too idle to sort it out properly. Another researcher will argue that it’s the students’ own fault, because that’s the nature of knowledge, and if they spent more time applying themselves and less time on their iPhones… I’m being a bit flippant to make the point that there are always many dimensions to any conceptual framework. You do have to make some decisions about what you’re interested in.

Even if you do, your framework will get quite complicated quite quickly, but it is a useful way of organising your notes, and ultimately will form the structure of your thesis, or article, or whatever it is you are preparing. Nor will you need all of it. You have to be quite ruthless about excluding data. But I’m getting ahead of myself. I should say why we need a conceptual framework for note making.

One of the problems with making notes is that it tends to be a bit hit and miss. If you’re working at your computer, you probably have lots of files (though you may not be able to find them, or remember what’s in them), but if an idea hits you on the train, or in the kitchen, or in someone else’s office, you might enter it in a note app on your phone, scrawl it on a Post-it, say something into a digital recorder, take a photo of it, or you might, as I do, rather enjoy writing in a proper old-fashioned notebook. The result is that, conceptual framework or not, you have a chaotic mess of notes.

To bring some order to this I recommend the excellent (and free) Evernote, which is available for virtually every conceivable mobile device and synchronises across all of them. Though I do like fountain pens and paper, Evernote is my main note making tool. (Incidentally, this blog post started life as an Evernote note, as I was thinking about my own conceptual framework – I thought it would be helpful to my students to share this.) As with any digital tool, it is only as good as the way you use it, which takes me back to the conceptual framework.

Evernote allows you to create as many “notebooks” as you like, and keep these in “stacks”. Think of a filing cabinet full of manila folders as a stack of notebooks. But you can also add tags to all your notes, which is a way of indexing your folders. (E.g. if you had a filing cabinet full of folders on inconsistency in VLEs, red paper clips attached to the folders might indicate the presence of a document arguing for teacher responsibility, and green clips the presence of documents arguing about student responsibilities.) Obviously with verbal tags you can have as many “coloured clips” as you like.
You do of course have to tag your folders consistently, and you have to bring all your notes together. No matter how good your digital note management app is, it can’t really do anything about the folded Post-it note in your back pocket. So good practice for a research student is, once a week, to bring all your notes together and think about your categories and your tags. (If you do use Evernote as I’ve suggested, you will also be able to print a list of tags, which will help you develop a much more sophisticated conceptual framework.)
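If it helps to see the idea written down more formally, here is a minimal sketch in Python (purely illustrative – the notebook names, tags and note titles are invented, and this is not Evernote’s own data model) of how notes, notebooks and tags relate, and how a tag works like one of those coloured paper clips you can filter on:

# Purely illustrative sketch: notes organised into notebooks, indexed by tags.
# The notebook names, tags and titles below are invented examples.
from dataclasses import dataclass, field

@dataclass
class Note:
    title: str
    notebook: str                           # the "manila folder" the note lives in
    tags: set = field(default_factory=set)  # the "coloured paper clips"

notes = [
    Note("Note on an article about VLE structure", "Literature",
         {"inconsistency:structure", "responsibility:teacher"}),
    Note("Idea scrawled on the train", "Fieldnotes", {"inconsistency:labelling"}),
    Note("Student forum complaint", "Primary data", {"responsibility:student"}),
]

def notes_tagged(notes, tag):
    """Return every note carrying a given tag, whichever notebook it sits in."""
    return [n for n in notes if tag in n.tags]

for n in notes_tagged(notes, "responsibility:teacher"):
    print(n.notebook, "->", n.title)

The point of the sketch is simply that the tags, not the notebooks, carry the conceptual framework: a note can only sit in one notebook, but it can carry as many tags (dimensions) as you like.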

Qualitative Research Traditions – teaching notes

Below is an edited digest of the notes I used for a teaching session I delivered this morning for novice researchers as part of our Researcher Education Programme.  They’re only an outline of the topics I covered and are designed to provoke discussion (which they did), but I thought students might find it useful if I wrote them up as a brief summary, so here they are.

Introduction
Perhaps the biggest problem in qualitative research is that it’s not really a single tradition. It’s really a group of traditions which have their roots in an exotic variety of academic disciplines, mostly in the social sciences and humanities. Much of what we’re going to look at in this session can be traced back to research in psychology, history, anthropology, sociology, literature and philosophy.
Objectives
If you’re doing qualitative research it’s important that you are able to recognise these traditions, because you’re going to come across them in the literature, and you may wonder why the authors are asking the questions that they are asking and why they’re drawing the conclusions that they’re drawing from the evidence they cite. It’s also true that you yourself will approach your work from a particular theoretical perspective, and it will be very helpful to you to read others who share that perspective (and perhaps even more useful to read those who do not. Charles Darwin, for example, claimed that he was in the habit of making a note of every objection that occurred to him, or was brought to his attention, so that he was prepared to deal with any objection that might be raised). Also, when it comes to writing up, your examiners will be looking for where you are situating your work.
Divisions
We can break qualitative research traditions into three groups. Note what they have in common – they’re all very human (you don’t get many chemists working in the qualitative tradition!).

First, there is the investigation of lived experience. That is, how do people, often particular types of people (experts, students, members of minority groups, workers in any given industry – whatever), experience life as they live it? (Note: not as how we, or the press, or anyone else thinks they should live it!) How do they experience interactions with each other, and with the world?

Second, there is the investigation of society and culture, which is characterised by studies of the way people come together into groups – which could be anything from a pub quiz team to a whole society, or dare I say a Researcher Education Programme – but also of how those social entities influence individuals’ thought and behaviour. Researchers in this tradition might look at rituals, values, customs and beliefs, and how they are transmitted to new members.

Finally, there is the study of language and communication. Language here is interpreted very broadly: there are many languages other than spoken ones. We all know what a red light means when we see it at a crossroads, but it has quite another meaning in a disreputable back street! So messages are conveyed by all sorts of things – a gesture, a logo, your clothes, a corporate livery – I could go on!
Lived Experience
Since my field is education, I’m going to concentrate on examples from that field, but these apply pretty much across the social sciences.

Cognitive psychology is the study of the structures and processes involved in mental activity, and of how they develop as individuals mature. Researchers in this tradition typically study how decisions are made and why people think the way they do. Characteristic research methods include protocol analysis, getting participants to keep diaries recording how they made decisions (or to think aloud as they do so), and comparing the results from different participants.

Life history is pretty much what it says it is. Researchers in this tradition argue that you can only really understand a person by studying as much of their life as possible. A researcher might shadow a subject, or conduct a number of extensive interviews with them discussing all sorts of things from their early childhood to their family life, or observe them in a number of settings. They might also interview friends, colleagues and family.

Phenomenology and phenomenography, despite their similar names, are quite different approaches. A researcher in the phenomenological tradition would identify a topic of personal and social significance, choose appropriate participants, interview each of them, and analyse the interview data with a view to getting descriptions of the phenomenon as experienced, from every possible angle, by those who experience it. At the same time it attempts to give the phenomenon meaning from outside the individual by looking at the structures that give it meaning. Phenomenography, in contrast, is the study of how people come to hold different views. A phenomenographer might study a group of teachers to understand how they come to hold different views and deploy different techniques in classroom management.
Society and culture
Ethnography is the study of any given culture, the features thereof and the patterns built into those features. Some of the most famous ethnographies are anthropological (e.g. Margaret Mead’s work in developing societies), but it is a tradition that has spread across many social science disciplines: first, it can tell us much about how individuals’ behaviours are determined by cultural values, beliefs and so on; second, it focuses on the emic perspective of a culture’s members; and third, on the natural settings in which individuals operate.

Critical theory, on the other hand, starts from critiquing that environment. In education, one of the most important scholars is Paulo Freire, who pointed out that education only reinforced oppression unless it opened the eyes of (in his work) Brazilian peasants to the ways in which they were oppressed. One such method of oppression was the education system itself, which was simply reinforcing the codes and customs of the oppressive society in which they lived. Freire argued that they had to take responsibility for their own education to overcome this. There are no particular methods associated with critical theory, since researchers in this tradition would argue that you can only determine the method once you have deconstructed the situation you are researching and identified the sources of oppression.

Finally, ethnomethodology is the study of how we learn to behave in situations, or the techniques individuals deploy to situate themselves in a situation. One technique characteristic of ethnomethodology is called breaching – that is, establishing what the social rules are, breaking them, and observing how people react to such rule breaking.
Language and communication
Hermeneutics is the study of the process by which individuals come to understand the meaning of a text. As I said earlier, a text can be anything that contains a message that can be read – it can be a document, but it could be an image, an outfit, or even a myth or social custom. Hermeneuticists claim that there is no objective reality, only what we interpret. It follows that any text must be informed by those interpretations, so we need to continually examine and re-examine the text in the light of its parts, and vice versa, to get a true understanding of the author’s interpretations.

Semiotics is the study of signs, and it differs from hermeneutics in that researchers in this tradition argue that the message doesn’t exist until the sign is created. They do share the view that anything can be a sign, but they argue that signs are reflective of particular social realities. Language, musical notes, mathematical symbols, and yes, street signs are all objects of semiotic study. A semioticist would be interested not only in what the sign says but in how it is written.

Finally, structuralism focuses on the systemic properties of phenomena – that is, what is essential about some feature of the social world – and each feature can only be understood by examining its relationship to other features of the same system. Consider a textbook, for example. It will have chapters, page numbers and an index, and will follow certain typographical conventions. (Indented text in italics would have no meaning on its own, but in an academic text it would almost certainly indicate a direct quotation.)
 
Summary of part 1
Research traditions are not paradigms and they’re not methods. They’re important in qualitative research partly because it is so varied. We couldn’t possibly cover all the different research traditions in three hours, and we certainly couldn’t cover all the methods. However, there is one research method (not a methodology, note) that can sit quite happily in any or all of these traditions, and it is a method that is very popular with students: the case study. Now, you can make a convincing argument that all research deals with cases – every experiment, every response to a survey is a case – but when we talk about case study research we generally mean the detailed examination of one or more instances of a phenomenon. Which also raises one of the most profound problems for qualitative research: if you accept that there is an objective reality external to our senses, then qualitative research has little to say about it. Put simply, how can you make any claim to knowledge based on a handful of case studies, or just one?
Case study
Well, you can find one answer to that in the way you design your case study. As with any method, your research problem will play a very significant role in your choice. What do you want to know about a topic? Is there some particular instance of that topic that will tell you something about it? Is it a very good example? Is it a very bad example? Is it just a typical example with nothing special to recommend it? These are the kind of decisions that will inform your sampling strategy. While sampling tends to be associated with quantitative research, it is equally important in qualitative research, since you’re basing your claim to knowledge on that case. So either the case tells you something about the wider phenomenon, or you’re simply claiming that your knowledge is limited to that case (which is by no means an invalid move for a researcher).

It’s particularly important if you’re trying to prove or disprove a proposition. (Logically, disproving a proposition is the only valid scientific move, since we can never say with absolute certainty what will happen next. Karl Popper famously posited that the proposition “all swans are white” would be fatally undermined by the appearance of a black swan. For the same reason I can’t prove a negative claim (“no swans are black”). You can argue around this – for example, that the black long-necked bird swimming in the pond outside the window is not in fact a swan – but you’re already on thin ice: you would have to say that being white was an essential property of being a swan.) But most research is trying to be more positive than that. Many case studies are evaluative – that is, they look at an instance of a programme or intervention and can show that x works in situation y.

Some other important things to consider with a case study are your own professional background and experience – in that sense a lot of case study researchers have a great deal in common with ethnographers, because it’s acknowledged that their presence in the case can have a significant influence.
Something that is often underestimated in all research is what’s called “entry to the field”. If you want to do a study of a particular case you have to get access. You have to identify gatekeepers and get past them, which can be really challenging, especially if you are dealing with a sensitive topic. (Try to do a case study of Mid-Staffordshire hospital and see where that gets you!) All sorts of issues arise – even if you can use power to “force” your way in, you may not get the highest levels of co-operation!
Data collection in case study
This is where the case study shows its flexibility and also reveals its roots in ethnography. Characteristic of case study is that you use multiple data sources, but within a boundary which you draw around your case (cases aren’t always easily defined). The point is to reveal as much as you can about the case from different angles. There are nearly always interviews in case studies, but you might also collect documents, observe people as they work within the case, take photographs, make notes on the settings, and so on.

All this raises some very important considerations. First, one piece of data might prompt you to collect some more data. You might be told in an interview of a document of whose existence you were previously unaware. So while you certainly go in with a plan, you may well go beyond it. Data collection is therefore emergent, in that it emerges from your research.

A second point is that you will collect a very large amount of data, so you should always record your interactions. Remember our recent discussions about metadata? Well, you should create metadata for all your interactions – it doesn’t have to be vastly detailed. But each interview should have a note of when and where it took place, the names of all present, any observations about the setting (including your subjective impressions), and the main points that emerged from the interview. Similar records should be made for every other piece of data. You can think of them as manifests if you like. (Shipping containers all have a piece of paper attached to them listing what they are supposed to contain, who sent it, and who is to retrieve it. This is called a “manifest”.)
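To make the “manifest” idea concrete, here is a minimal sketch (in Python; the field names and the example values are invented for illustration, not prescribed by any particular method or tool) of the sort of record you might keep for each interview or other piece of data:

# Illustrative only: a simple "manifest" record for each piece of case study data.
# The field names and example values are invented; adapt them to your own study.
from dataclasses import dataclass
from datetime import date
from typing import List

@dataclass
class Manifest:
    item_type: str            # "interview", "document", "photograph", ...
    collected_on: date
    location: str
    people_present: List[str]
    setting_notes: str        # including your subjective impressions
    main_points: List[str]    # what emerged, in your own words

example = Manifest(
    item_type="interview",
    collected_on=date(2012, 3, 14),
    location="Deputy head's office",
    people_present=["Researcher", "Deputy head"],
    setting_notes="Interrupted twice by the phone; interviewee seemed guarded at first.",
    main_points=["Staff concerns about timetabling",
                 "Mention of an internal review document I haven't yet seen"],
)
print(example.item_type, example.collected_on, "-", "; ".join(example.main_points))

Even a record as thin as this will save you hours later, when you are trying to remember which interview mentioned that elusive document.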
Finally, as you are generating so much data, it can be very difficult to know when to stop. These four guidelines are helpful. First there’s exhaustion (of data, not you), which you can identify when you know your respondents won’t or can’t tell you anything new (and pull a resigned face as you walk into the room), you’ve read all the documents you can reasonably hope to read, taken all the photos – well, you get the idea. Second, there’s saturation. We haven’t talked a great deal about coding in this session, but as you collect data you begin to assign bits of it to categories, which you will have derived from the literature, from your own thinking and so on. This is a bit of a subjective judgement, but when you find you aren’t really adding new categories, or you feel you have enough evidence to make the point you want to make about each of your categories, then you can consider stopping. Similarly, when you’re just finding the same things, so that your data is showing consistent regularities (e.g. all the teachers are making the same complaint about the principal), you can stop. Finally, you have to consider whether you are going beyond the boundary of your case and your research question – whether you are overextending yourself.
Generalisability
This is probably the biggest issue that qualitative researchers have to deal with, and it’s particularly a problem for case study researchers. Positivists would say that you can’t generate enough scientific data from a small study, and certainly not from a single case. Well, let’s discuss that.
What about an extreme case? Flyvbjerg (2006) refers to a study of a petrochemical plant which was undertaken to see whether exposure to particular solvents caused brain injuries. The study was of a single plant which was highly commended for its health and safety practices. Why did the authors argue that the data they collected was generalisable? Because if they found evidence of brain injuries among workers in such a plant, they were likely to find it in plants which were less assiduous about health and safety. So an extreme case of a phenomenon is likely to provide knowledge about that phenomenon. In the same article he refers to a study of the persistence of working-class family relationships. The theory was that the strong family ties that were characteristic of the British working class would be weakened by increasing affluence. So the researchers picked a single town (Luton) in the late 1960s which happened to be very affluent. They found that in fact the ties and relationships persisted, so they could then posit that these ties were not necessarily a response to economic hardship.

Of course there are other kinds of generalisation. There is a quasi-positivist argument that the more case studies find the same thing about a phenomenon, the stronger are our grounds for believing that thing. There are also other claims to truth. Many case study researchers (and qualitative researchers in general) reject the basis of the positivist claim to knowledge and instead try to achieve credibility, plausibility and familiarity. Case studies of professional practice, for example, often ring true with their readers. But one criticism of that approach might be that you’re making the reader do all the work!
Pros and cons
I’d just like to finish with a list of pros and cons of case study research. Among the big pluses is, I think, the fact that case studies tend to be very readable. There’s a bit of a debate about whether you should anonymise them. I tend to agree that reading about a real case has much more resonance, but that’s really very unusual: there are often compelling ethical reasons for anonymising your study, and frankly you may not get access unless you promise to do so. Even so, a well written study can still ring true with the reader. The fact that case studies follow the ethnographic tradition by trying to expose the emic perspective – that is, the experience of what it is like to be “in” the case – is often helpful in this regard. But as I said earlier, case studies can be compatible with many research traditions.
They are not without their disadvantages. First is the fact that they are often very difficult to do. (New doctoral students sometimes see them as an “easy” option; really, nothing could be further from the truth.) There is the difficulty of gaining access, of being sure that you are getting the full picture, and the fact that even if you do succeed in that endeavour, you will generate a large quantity of data, which will take a long time to analyse. We’ve discussed the generalisability issue, and I think it is worth repeating that generalisation can have multiple meanings. No, you can’t generalise from a single case in the positivist sense, but you can indulge in theoretical or analytical generalisation.
Finally, bear in mind the ethics of doing a case study. There are nearly always ethical problems in research, and case studies have a habit of hiding theirs until late. What do you do if you’re studying a hospital department for example and find high levels of incompetence? Who is your responsibility to? The research participants, or the hospital’s patients?  That’s not an easy decision, and actually ethical issues are usually less clear cut than that.
The session would normally conclude with a debate about the students’ own ethical problems, but on this occasion Drs Kathleen Watt and Catherine Burge gave a presentation on practice-led research.

Degrees in Second Life

http://devel2.njit.edu/serendipity/index.php?/archives/1117-Getting-Your-Degree-in-Second-Life.html

Well, I guess it had to happen. A college in Texas is offering what it believes to be the first degree offered via Second Life. I haven’t had a good look around yet (the website mentioned in the blog entry I linked to above is down), but I can think of all sorts of reasons why this might be problematic. Before I go into that, I do want to make it clear that I think Virtual Worlds like SL have a lot of potential for educators. (Yes, I do have an avatar in Second Life – Feather Congrejo – although I’m a fairly rare visitor these days.)

So what are my reservations? Firstly, Second Life gives me a headache if I use it for any length of time. (Must be my ageing eyes, but a colleague who attended a six-hour conference in SL reported the same phenomenon!) Secondly, it needs quite powerful graphics cards, a requirement which seems to increase with every upgrade they produce, and I think that is a big accessibility issue. Thirdly, SL is a public site and has, inevitably, some less than salubrious areas. (Quite a lot, actually!) OK, I suspect this is actually quite a small proportion of SL’s total facilities, and students in HE are adults and we can’t hold their hands all the time, but I can’t see any HEI relishing the prospect of misinformed local media announcing that it is directing students into what might be described as “adult” web services. I suppose you could get round that by using something like OpenSim (http://opensimulator.org/wiki/Main_Page) for a standalone environment, but you’d lose a lot of connectivity in doing so.

It also requires quite a lot of skill to build a properly immersive environment. It can be done, but it takes time and skill, and teaching in SL seems to require that quite a lot of time is devoted to orientation. (I suppose that’s a one-off cost with each cohort of students, though.) The other issue is how to devote sufficient time to each student while continuing with real-world work. I’ve always thought that one great advantage of technology enhanced learning is that it allows the “quieter” students a chance to get involved. But there’s no getting away from the fact that it takes more time to deal with 30 problems or questions than it does to deal with the 5 or so assertive students in any class.

Teaching as stand-up comedy?

I saw this in the Guardian last Monday, and I think there are a few lessons in it that we might take on board.

http://www.guardian.co.uk/stage/2008/jul/21/comedy

Now, I have some reservations about turning everyone into stand-up comedians, but I did like that last line about “I will never resort to PowerPoint in a lecture again”. It is important to interact with the audience, and I think we do often hide behind our visual aids. Anyway, I’m always open to learning from strange new sources. But I’ll stop blethering on and let you read and judge for yourselves.

Remodelling Teaching, Rethinking Education

CERD organised a one day conference on this topic today, and it proved a very interesting day indeed. I’m not going to say too much here, because we do intend to provide much more information about the day, including papers from the speakers, via the web.

From my point of view, the first presentation, from Professor Mike Bottery of the University of Hull, proved particularly interesting. He was talking about the deprofessionalisation of teaching, or more accurately how teachers are moving away from being regarded as professionals (with all the rights to set one’s own agenda that that implies) to “branded technicians” – essentially people charged with delivering a set of specific competencies to meet a particular demand for a particular type of education. As this is my blog, I’m going to reflect on the relevance of this to my own work: it is precisely my concern about what we were being asked to do in the old TLDO. The whole agenda seemed to me to assume that academics were failing to come up with the goods, whereas in my view they quite obviously weren’t (and nobody seemed to know exactly what “the goods” were!), and we were faced with pushing a lot of unconvincing agendas about PDP and skills, for example, that relatively few people seemed particularly interested in. The challenge for the EDU is to reclaim its credibility as a professional support mechanism, and I think we are now going some way to doing that by communicating more with our own clients than with external agendas. (Not that the external agenda has gone away, of course.)

The last speaker, Michael Apple, also picked up on this issue, but he was much more concerned with how educational institutions engaged (or rather didn’t) with their communities. He gave the example of how communities in Brazil had incorporated street gangs (who previously had been excluded, not altogether surprisingly) into local decision making processes. Clearly that’s an extreme example, but he did suggest that universities tend to exclude a lot of people who are absolutely essential to their work (building, catering, gardening and secretarial staff, and so forth) from decision making processes, and that they might benefit from a more inclusive approach. Coincidentally, I had occasion to visit another university recently where I noticed that the development unit formally made provision for these staff, and the development programme was structured in the same way as it was for everyone else. Well, it’s not much, but it’s a start.

The other two sessions were a very interesting debate about Rethinking Higher Education, presented by Professor Mike Neary of Lincoln and Dr Glenn Rikowski of Northampton, and a session on workforce reform, social partnership, and the construction of consensus. This last was very much about research into trade union involvement in workplace remodelling in schools, and in truth I didn’t feel I had, or have, a lot to bring to this debate. (A deplorably instrumentalist attitude no doubt, but there you are!) On the other hand, the Rethinking HE session was quite thought provoking, arguing that universities should be sites of co-production of critical knowledge on the part of both staff and students. I don’t disagree, but I do worry about the replacement of one orthodoxy with another. Mike was talking about the notion of mass intellectuality, or Marx’s notion of the general intellect. The latter gives me pause for thought. I don’t think Marx meant any sort of singular Orwellian “newspeak” or “new intellect”, but it’s easy to interpret it that way. I suppose the same goes for mass intellectuality, but at least that seems to me to accommodate multiple viewpoints. I think I just have a natural antipathy to anything that smacks of mob rule, and am rather uneasy with anything that might facilitate it.

The other thing I was a bit dubious about was beginning with the quotation “We work but we produce nothing”, which apparently comes from the student revolts of 1968. But that falls into the trap of believing that corporeality is an essential property of “something”. Work always produces something – even if it’s just a headache! In this case I find it hard to believe that the students’ work did not produce at the very least a new sense of self among themselves (and that quotation, come to think of it!). There’s a lot more to think about here, though, and I think I need to take it to my research blog for that kind of reflective consideration.

Where does CERD go? Well, we’ve taken some steps towards working with students. Perhaps we should start to give some thought to the needs of the wider university workforce. Let’s face it, without the catering staff’s coffee the place wouldn’t run at all!