RG3 Analysis

Research Group 3 Data

Several questions were posed to an invited student group drawn from one metropolitan UK university (for more details, please refer to the Methods page and also the *Participants page, which is password protected), and the consequent discussion took place over a period of approximately one month. While a wider selection of students would have been ideal, recruitment proved more problematic than initially envisioned. Of the eight students who were asked to participate, four engaged in some depth and provided a fair amount of useful data. This data was then categorised and mapped to the equivalent Literature Themes. The Contextual Categories developed from the RG2 data were used to grade the context of responses, and a ‘problem’ or ‘benefit’ tag was then allocated.

Questions and Topics

Questions asked, or topics highlighted as discussion kick-off points, were:

  1. “Do you think staff know more or less than you do, about technology?”
  2. Open chat covering tech skills/knowledge and ideas: whether staff have enough imagination to use technology, and how they use it
    1. “Do you expect more in the way of how technology is used in your teaching and learning?”
  3. (Some) points to consider:
    1. Could our digital library service be much better, if so how?
    2. Do you expect more in the way of provision of technology to help with your studies – if so, what would you like to have?
    3. Do you wish you could log in once, for everything?
  4. Quick questions about your thoughts on Blackboard, and how it is used. Give feedback on any aspect or all aspects – whatever takes your fancy:
    1. Do you think course material is good quality, well organised and easy to find?
    2. Do all lecturers provide material in Blackboard?
    3. Do all lecturers provide material in Blackboard in a timely manner, i.e. in time for class, or revision, for example?
    4. Do you think material in Blackboard adds to the value and quality of your learning?
    5. Do you think Blackboard is easy or hard to access?
    6. Do you think Blackboard is up to date or old fashioned?
    7. Have you ever used a discussion forum in Blackboard?

Table 1: RG3 responses collated into category types

[Image: RG3-chart1]

Fig 1: RG3 category response allocation (percentages)
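
The percentage allocation in Fig 1 is the result of a straightforward tally of tagged responses. The following is a minimal illustrative sketch in Python, assuming hypothetical tagged excerpts; the category labels mirror those used later in Table 2, but the values are invented rather than the real RG3 data.

```python
from collections import Counter

# Hypothetical tagged excerpts: each student comment has been allocated to
# one of the four response categories. These values are invented for
# illustration only and are not the real RG3 data.
tagged_responses = [
    "Skills and Know How",
    "Quality of Learning Materials & Learning",
    "Quality of Learning Materials & Learning",
    "Provision Perceptions",
    "Provision Perceptions",
    "Provision Expectations",
]

counts = Counter(tagged_responses)
total = sum(counts.values())

# Percentage allocation per response category, as plotted in Fig 1.
for category, count in counts.most_common():
    print(f"{category}: {100 * count / total:.1f}%")
```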

 

Theme and Contextual Category Allocation

The questions were not strongly influenced by the other research groups' work or by the literature review findings, though there was some intention to generate a set of responses that would allow the perceptions of students and staff to be contrasted around a number of key topics. The intention was also not to control too directly how students would respond, so topics and groups of questions were offered rather than following a straightforward question/answer procedure, in the belief that this would produce more authentic responses.

From the responses which naturally evolved, several of the top six themes are clearly present and overlap between the main response categories. Refer to the table below for how response categories equated to the themes present. Responses were then interpreted according to the best fit with the contextual categories derived from RG2; to achieve this, all contextual categories were adopted as relevant to interpretation. In the case of students, ‘Expert Knowledge’ was regarded as personal experience or actual events being recalled. The ‘problem’ or ‘benefit’ allocation was made on the predominance of positive or negative comments, as sketched after Table 2.

 

Response Category | Literature Themes (in top six?) | Contextual Category allocation | Problem or Benefit
Skills and Know How | Skills and Training (No); ICT and Elearning support (Yes) | IAL-H, EK, HS | PROBLEM
Quality of Learning Materials & Learning | Learning Quality (Yes); Learning Design (Yes); Student Centred Learning (Yes) | EK, IAH, PA, HS | PROBLEM
Provision Perceptions | Institutional Support (Yes); ICT and Elearning support (Yes); Student Centred Learning (Yes) | IAL, EK, PS | PROBLEM
Provision Expectations | Institutional Support (Yes); ICT and Elearning support (Yes); Student Centred Learning (Yes) | P&C, IAH-L, EK | BENEFIT

Table 2: RG3 responses categorised, showing the relationship to literature themes and the contextual category allocation. Also indicates the overall problem or benefit context of responses.
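
The ‘problem’ or ‘benefit’ tag was decided by the predominance of positive or negative comments within each response category. The snippet below is a minimal sketch of that decision rule only; the comment counts are invented for illustration and are not the RG3 figures.

```python
# Hypothetical counts of positive and negative comments per response
# category; the figures are invented for illustration only.
comment_counts = {
    "Skills and Know How":                      {"positive": 2, "negative": 7},
    "Quality of Learning Materials & Learning": {"positive": 3, "negative": 8},
    "Provision Perceptions":                    {"positive": 1, "negative": 6},
    "Provision Expectations":                   {"positive": 9, "negative": 2},
}

def allocate_tag(positive: int, negative: int) -> str:
    """Tag a response category by the predominance of comments."""
    return "BENEFIT" if positive > negative else "PROBLEM"

for category, c in comment_counts.items():
    print(f"{category}: {allocate_tag(c['positive'], c['negative'])}")
```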

 

Problems and Benefits Hierarchy v4

The fourth iteration of the Problems and Benefits Hierarchy can now be compiled to include the final set of data from RG3.

Table 3: RG3 contextual ranking scores added to the RG2 and Lit theme contextual ranking scores
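
Reading ‘added to’ literally, the compilation step amounts to summing the ranking scores for each contextual category across the literature themes, RG2 and RG3. The sketch below assumes that interpretation; the category codes echo those in Table 2, and the scores themselves are invented placeholders rather than the values shown in Table 3.

```python
# Invented ranking scores per contextual category for each data source.
# The real values are those presented in Table 3; these are placeholders.
lit_theme_scores = {"EK": 3, "IAH": 2, "IAL": 1, "HS": 2, "PA": 1, "PS": 1, "P&C": 2}
rg2_scores = {"EK": 4, "IAH": 3, "IAL": 2, "HS": 1, "PA": 2, "PS": 2, "P&C": 1}
rg3_scores = {"EK": 3, "IAH": 1, "IAL": 2, "HS": 2, "PA": 1, "PS": 1, "P&C": 1}

# Hierarchy v4: combined score per contextual category across all sources.
all_categories = set(lit_theme_scores) | set(rg2_scores) | set(rg3_scores)
hierarchy_v4 = {
    category: (lit_theme_scores.get(category, 0)
               + rg2_scores.get(category, 0)
               + rg3_scores.get(category, 0))
    for category in all_categories
}

# Rank contextual categories by combined score, highest first.
for category, score in sorted(hierarchy_v4.items(), key=lambda item: -item[1]):
    print(category, score)
```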

 

Problems and Benefits Hierarchy Overview

The final ‘overview’ Problems and Benefits Hierarchy is provided below for reference (for more information, see the Findings section).

Table 4: Problems and Benefits Hierarchy Ranking Overview

* The information listed on the Participants page is password protected to preserve the privacy of those who took part. The information can be made available, but only for the purpose of evidencing that real people took part in the research.