Category Archives: Review

The many flavours of SCORM

This post documents my investigation into getting SCORM-packaged content to play in Blackboard, and into what we can learn about learners’ interactions via the Grade Centre.

There is often a need for a simple way to insert learning materials created outside the LMS (Blackboard, Moodle, Sakai, etc.) into it. These could be materials produced by a third party – perhaps supporting a published textbook, or a custom training course. It is at this point that people usually start talking about SCORM.

What is SCORM?

SCORM (the Sharable Content Object Reference Model), although not a standard (sensu IMS, etc.), is the nearest thing there is to one. Whilst newer formats such as Tin Can (also called the Experience API) offer more possibilities, at the time of writing very few LMS vendors have implemented these sufficiently to make them any more functional than the latest flavours of SCORM. The list of Tin Can adopters published by Rustici is impressive, but may perhaps be best seen at present as a list of those working to implement it fully (see for example these posts regarding Sakai, Blackboard and Moodle). Apologies in advance if there are more recent developments that I am not aware of!

The trouble with SCORM is that it is not a single thing. In fact it is a wrapper around a series of existing formats, and you can meet this reference model in several different ways. Most people using SCORM today use one of these “flavours”:

  1. SCORM 1.2
  2. SCORM 2004 – which is available in 4 editions, edition 4 (2009) being the latest.

Rustici have published a very useful set of SCORM resources. In this post I am going to take some custom content created in Adobe Captivate, export it using a range of SCORM flavours, and import the resulting files into Blackboard. I will then look at the reporting options available and how these measure up to those for a standard test/quiz.

Technical Information

  • Adobe Captivate 8.0.2.266 running on a MacBook Pro with OS X Yosemite
  • Blackboard Learn 9.1 October 2014 release (CU2)
  • Building Block: Rustici SCORM Engine 2013.4.1164954*

* Blackboard released a new version of this Building Block as I was part way through these tests. I have learnt the hard way not to apply these too quickly.

The Content

I created a simple object (which I called Vanilla) which uses a range of different question types. Some of these are drawn from pools, some are hard-coded into the object. In any run through, the user will be asked to answer eight questions, each worth 10 points:

  1. Multi-choice (single answer) question – 1 drawn at random from a pool of three questions
  2. True/False question
  3. Fill in the blanks question
  4. Matching question (three heads to three tails)
  5. Image hotspot question
  6. Ordering question (arrange the 7 steps of a process in the correct order)
  7. Multi-choice (single answer) question – 1 drawn at random from a pool of three questions
  8. As 7, with a different question drawn from the same pool.
Creating the Quiz in Captivate

Scoring Options

Before exporting anything, you need to define how a person passes the test. This is defined in the project Preferences: File | Project Info, then click on the Pass or Fail option down the LHS:

Scoring Options

You can see here that we have set the pass rate to be 80% (pretty high). There is also the option to choose a numerical score as the threshold. The other options (relating to what happens if you fail and the ability to retake the quiz) can be ignored for the purposes of this test. The next step – deciding what to report back – differs depending on the SCORM flavour you use…

Export Options

A SCORM object can report (to the LMS, usually via the Grade Centre or equivalent) different measures, depending both on the SCORM flavour you use and the way you set it up. Typical fields it can report are:

  • The Completion status (how many of the pages of content you actually viewed). When a SCORM course is launched, the status changes to Incomplete. It changes to Complete when the user meets certain criteria – the Completion Criteria.
  • The Score of any quizzes taken
  • The score and/or the completion status can also be used to derive a Pass/Fail flag
  • Ideally, you also want it to provide details of the Interactions – how long people spent on each page, which questions they attempted (remember these may be drawn at random from a pool) and the answers they provided (the sketch after this list illustrates the kind of runtime calls involved).
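
To make the list above concrete, here is a minimal sketch – and emphatically not Captivate’s generated runtime code – of the SCORM 1.2 calls a package makes to report these measures back to the LMS. The API object and data model element names are those defined by the SCORM 1.2 run-time specification; the question identifier and score values are invented for illustration.

  // Sketch only: how a SCO reports completion, score and an interaction.
  // "API" is the object the LMS exposes to the content window (SCORM 1.2).
  interface Scorm12Api {
    LMSInitialize(arg: ""): string;
    LMSSetValue(element: string, value: string): string;
    LMSCommit(arg: ""): string;
    LMSFinish(arg: ""): string;
  }

  declare const API: Scorm12Api; // provided by the LMS frameset

  API.LMSInitialize("");

  // Completion status: starts as "incomplete", updated once the Completion Criteria are met
  API.LMSSetValue("cmi.core.lesson_status", "passed");

  // Quiz score, reported against the declared minimum/maximum
  API.LMSSetValue("cmi.core.score.min", "0");
  API.LMSSetValue("cmi.core.score.max", "80");
  API.LMSSetValue("cmi.core.score.raw", "70");

  // One interaction record per question answered (this id is invented)
  API.LMSSetValue("cmi.interactions.0.id", "Q02_TrueFalse");
  API.LMSSetValue("cmi.interactions.0.type", "true-false");
  API.LMSSetValue("cmi.interactions.0.student_response", "t");
  API.LMSSetValue("cmi.interactions.0.result", "correct");

  API.LMSCommit("");
  API.LMSFinish("");

Note that SCORM 1.2 only records identifier-style responses (like the "t" above), which becomes relevant when we compare the Grade Centre reports later.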

In the simplest case, the Completion Criteria might be simply that the user has launched the package. It is more commonly defined as viewing a certain number/percentage of slides and/or achieving a pass score in the quiz. In order to export content as SCORM you need to adjust the project Preferences: File | Project Info, then click on the Reporting option down the LHS:

SCORM 1.2

If you select SCORM 1.2 under the Standard select control the rest of the page looks like this:

SCORM 1.2 settings, configured as per suggestions

You can see that you can choose to simply report that a user’s interaction with the object is complete or incomplete, or go for the slightly more nuanced incomplete/fail/pass. In the second step – Success/Completion Criteria – you define what counts as a Pass, in terms of User Access (simply viewing a percentage of slides) and/or passing the quiz. If you click the Configure button, you can see the values that will be written into the SCORM manifest file (these values are used when you deploy the content into the LMS):

Note that you only have the option to export the 3rd Edition flavour of SCORM 1.2 in Captivate. It also only supports a single SCO (Sharable Content Object) within a project (actually the entire project), even though SCORM itself allows several. This restricts the reporting options, although in most cases a single SCO is adequate. The default Advanced settings are used in these tests:

Advanced settings
If in doubt leave these as they are

SCORM 2004

If we select SCORM 2004 under the Standard, the page changes slightly. The Status Representation options seen in SCORM 1.2 disappear and instead we have the option to define separate Completion and Success criteria. This allows for the same level of reporting as the earlier incomplete/fail/pass option: incomplete = Completion criteria not met, Success criteria not met; fail = Completion criteria met but Success criteria not met; pass = both Completion and Success criteria met. You can define both criteria using test scores, number of slides viewed or just opening the package. In my case I have opted to define completion as viewing a certain number of slides and success as passing the test. Again there is the option to report the Interaction Data. If you click Configure, the dialogue displayed is similar to SCORM 1.2, but note that the user has the option to select their preferred version of SCORM 2004 – one of the 2nd, 3rd and 4th editions.
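
As a quick illustration of that mapping (my own sketch, not anything Captivate generates), the two SCORM 2004 status fields collapse back to the single SCORM 1.2-style view like this:

  // SCORM 2004 reports the two statuses separately, e.g.:
  //   API_1484_11.SetValue("cmi.completion_status", "completed");
  //   API_1484_11.SetValue("cmi.success_status", "passed");
  //   API_1484_11.SetValue("cmi.score.scaled", "0.875"); // 70 out of 80

  type CompletionStatus = "completed" | "incomplete" | "not attempted" | "unknown";
  type SuccessStatus = "passed" | "failed" | "unknown";

  // Collapse the pair back to the single incomplete/fail/pass view
  function legacyStatus(completion: CompletionStatus, success: SuccessStatus): string {
    if (completion !== "completed") return "incomplete"; // Completion criteria not met
    if (success !== "passed") return "fail";             // enough slides viewed, quiz not passed
    return "pass";                                       // both criteria met
  }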


I chose to use the latest one. I left the Advanced settings at their defaults (as with SCORM 1.2). For testing I published three SCORM 2004 versions, one for each edition available in Adobe Captivate.

Publishing the Content as a Zip File

Regardless of SCORM flavour, the process of publishing the content is the same: File | Publish…

You can see that this file will be deployed as HTML rather than a SWF. The chosen SCORM flavour is shown at the bottom right.

Note that the option to Zip Files is ticked. In summary, at the end of the tests I had four SCORM packages:

  • SCORM 1.2 3rd Edition
  • SCORM 2004 2nd Edition
  • SCORM 2004 3rd Edition
  • SCORM 2004 4th Edition

Deployment Options

The content was added to a new Blackboard course as a Content Package (SCORM) – this is the custom content type defined by Rustici’s building block:

Adding the SCORM package

You are prompted for the location of the zip file you published:

Click ‘Browse My Computer’ to locate the ZIP file

After selecting the file, I clicked Submit to upload it to Blackboard, where the manifest file is parsed. The first part of the form reads the descriptors from the manifest and asks you to confirm the details displayed to students:

Note that the Title and Description values are taken from the manifest, using the values you typed into the Configure box above. The second section determines how users can interact. After thanking the developer for selecting “No” as the default option for Make SCORM Available (you can never get enough radio button clicks), it’s down to the attempts. For these tests I am going to allow an unlimited number, without any date restrictions. There is no sense in turning tracking on, as you will get much better data from the SCORM object itself when you open the Grade Centre.

The third section is where you define the integration with the Grade Centre (unless you choose the option No Grading). Note that even if the content doesn’t have a quiz baked into it, you could set a simple score related to completion that would indicate the people who had worked through it. Blackboard users should note that the Title setting here is what is used to name the Grade Centre column. Given the multiple flavours of SCORM that I am testing, I will have to be careful here and use one of these names each time:

  • v1.2
  • v2004e2
  • v2004e3
  • v2004e4

Keeping them short is a vain attempt to reduce the scrolling in the Grade Centre. I’m selecting to use the SCORM Score as the Grade, so I will see how well people have done in the quiz. The Grade SCOS section (which should read Grade SCOs) can be left at No, as this Captivate content can only ever have one sharable content object, so there is no need to provide separate scores for each part. I am leaving Grade Timing at showing the Grade of Last SCORM Attempt (though you can get them all via the Grade History).

Finally, clicking the Submit button creates an entry in Blackboard.


When you click the link, the chances are that, out of the box, it will not behave as you wish. In my case, it opens in a new popup window that is not quite the right size:

Opening in a pop-up – note the scroll bars 😦

To sort this, you need to edit the content again in Blackboard and make use of a feature that only appears after initially submitting it and which does not conform to the Blackboard UI style manual (oh yes, there is one!). Below the SCORM AVAILABILITY entry there is now a new entry, ADVANCED OPTIONS.


If you change the radio button to Yes, a whole heap of extra options are displayed.

The first set relates to the Navigation Controls – these can be important if you want to show/hide the course structure down the left-hand side, or display a Close button at the top of the screen.


Clicking any of the other text below the radio buttons, e.g. Launch Behaviour, gives you more options. This is where you can set the size of the opening window. The page helpfully highlights any changes in a pretty shade of rose pink.


That’s it sorted now.

I then repeated the process for the three SCORM 2004 flavours.
Note that as far as configuration within Blackboard goes, in this case the same settings can be applied to all four flavours.

Note: later I explored these settings in more detail and came across a variable that sets the SCORM edition, under Compatibility Settings:

How important is this?

I was not impressed that this was not set correctly by default for the SCORM 2004 items – it seems to default to the 2nd Edition 😦
On the up side, changing this did not seem to make any difference to the content item’s behaviour.
The SCORM 1.2 item did not have this setting, so at least that was detected correctly!

Completing the Tests

I then logged in as a student and completed the four SCORM objects. You could not tell the difference from the user perspective.

I will come back and look at the experience using a range of devices (laptop browser, mobile browser and mobile app).

Logging back in as an instructor, I can see the entries for each SCORM item in the Grade Centre.


To see the detail of any given attempt, you have to select it using the context menu at the right hand side of the entry:

see an attempt

To cut to the chase, there is a difference in the data displayed from SCORM 1.2 and SCORM 2004 items, but no difference between the various SCORM 2004 flavours:

SCORM 1.2 3rd Edition
SCORM 2004 2nd Edition
SCORM 2004 3rd Edition
SCORM 2004 4th Edition

The key advantage of the SCORM 2004 format is that you see the text of the answer selected by the user, rather than just an identifier (e.g. a, b, etc.).

The utility of the response varies by question type. For text responses it is easy to see which response the learner chose and whether it was correct, but for others – e.g. hotspots, where just the co-ordinates are displayed, or ordering questions, where you need to parse (URLDecode) the entries selected – you need to pay much closer attention.
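
For what it’s worth, tidying up an ordering response only takes a couple of lines. The snippet below is a sketch based on my assumption that the value shown in the attempt detail is a URL-encoded, comma-separated list (the example steps are made up) – check what your own content actually reports before relying on it.

  // Sketch: split an ordering-question response and URL-decode each entry.
  function decodeOrderingResponse(raw: string): string[] {
    return raw
      .split(",") // one entry per step the learner placed
      .map((step) => decodeURIComponent(step).trim());
  }

  // Hypothetical value, purely for illustration:
  // decodeOrderingResponse("Harvest%20pods,Cure,Dry%20in%20the%20sun")
  //   -> ["Harvest pods", "Cure", "Dry in the sun"]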

It is interesting to see how long a student spends on each question.

That’s about as good as it gets though. There is no overview available across all the students on the course, no ability to analyse the effectiveness (discriminatory power) of the individual questions.

Item Analysis

Compare the results above with the sort of analysis that comes out of the box for the much-derided Blackboard Quiz:

From the Control Panel in a Course:
Course Tools | Tests, Surveys and Pools | Tests
and from the context menu next to the corresponding Test choose Item Analysis

Item Analysis

Running the analysis can take some time (especially if the class contains a lot of students) but the results are worthwhile.

In the screenshots below I am displaying Item Analysis results for the same questions used in the SCORM package, but deployed as a Blackboard Test.

The Item Analysis starts with a summary of the test: how many people have taken it, and an indication of how individual questions score in terms of difficulty (the percentage of students who answered the question correctly) and discrimination (the ability to distinguish between students who know the subject matter and those who don’t, based on their response to this question).
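
To be clear about what those two numbers mean, here is a rough sketch of how they can be calculated. The difficulty definition matches the one above (the percentage answering correctly); for discrimination I have used the classic upper/lower-group index as an illustration – I have not checked exactly which statistic Blackboard uses (a point-biserial correlation is another common choice), so treat this as indicative only.

  interface Attempt {
    totalScore: number;         // the learner's overall score on the test
    answeredCorrectly: boolean; // did they get this particular question right?
  }

  // Difficulty: the percentage of learners answering this question correctly
  function difficulty(attempts: Attempt[]): number {
    const correct = attempts.filter((a) => a.answeredCorrectly).length;
    return (100 * correct) / attempts.length;
  }

  // Discrimination (upper/lower-group index): compare the top and bottom ~27%
  // of learners by total score. Values near 1 discriminate well; values near
  // 0 (or negative) mean the question does not separate strong from weak performers.
  function discriminationIndex(attempts: Attempt[]): number {
    const sorted = [...attempts].sort((a, b) => b.totalScore - a.totalScore);
    const groupSize = Math.max(1, Math.round(sorted.length * 0.27));
    const upper = sorted.slice(0, groupSize);
    const lower = sorted.slice(-groupSize);
    const pUpper = upper.filter((a) => a.answeredCorrectly).length / groupSize;
    const pLower = lower.filter((a) => a.answeredCorrectly).length / groupSize;
    return pUpper - pLower;
  }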


Below the table is a list showing the results for each question in detail. You can filter this, e.g. to show just the questions with a poor discrimination value:

poor questions


After applying several filters to get a feel for the data, you can drill into each question to see the statistics behind these measures:

fair questions


In this table we see columns of figures. The first shows the breakdown of responses across all learners. For example, 3.87% of all the people who worked through this material thought (wrongly) that Vanilla translates as “Nectar of the gods”.

The remaining columns show the responses selected by four groups of learners, grouped by their final score. So we can see that none of the people whose final score placed them in the top 25% (Q1) chose the option “Nectar of the gods”, whilst 15 of the people in the 50–75% band (Q3) selected it, and 6 of those in the lowest 25% (Q4) chose it. This information is very useful when you come to review the questions.

It would be great if this information were available from SCORM content too but, at least in the Blackboard implementation, it is not.

Recommendation

I think SCORM can be used to deploy custom content, to convey information, introduce new ideas and help students assimilate this information into their own mental model. If you are only interested in the final numerical result, then it makes sense to include a test within the SCORM content. SCORM 2004 provides better reporting than SCORM 1.2.

If, however, you want to be able to refine the questions and possibly tune the training materials based on repeated wrong answers, then I think at present it is better to decouple the quiz from the SCORM object and deploy it as a separate stand-alone test (possibly only visible once the learner has completed the SCORM-packaged content at least once). This also gives you more flexibility should you want students to retake the test later and possibly revisit the materials if they fail the test the second time.

Device-Dependent Display

I also wanted to test the way the content was deployed using a range of configurations. The Captivate project used a responsive template, so the content uses one of three defined layouts, with break points defined at display widths of 1024 pixels (desktop), 768 pixels (tablet) and 360 pixels (mobile):

Desktop Layout

Tablet Layout

Mobile Layout

To test this I launched the course using a laptop and a tablet. I considered using a mobile phone too, but I think this form factor is really too small for much e-learning content. For the laptop I wanted to see the effect of resizing the window. For the mobile devices, I wanted to see if the content reacted to a change in orientation.
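
Before the results, a quick sketch of what “responsive” means here. This is not Captivate’s actual player code, just an illustration of how the three breakpoints quoted above (1024, 768 and 360 pixels) might drive the choice of layout whenever the window is resized or the device rotated:

  type Layout = "desktop" | "tablet" | "mobile";

  // Pick a layout from the current width, using the project's breakpoints
  function pickLayout(width: number): Layout {
    if (width >= 1024) return "desktop";
    if (width >= 768) return "tablet";
    return "mobile"; // anything narrower falls back to the mobile layout
  }

  // Re-evaluate on resize (and, on a tablet, on orientation change)
  window.addEventListener("resize", () => {
    document.body.dataset.layout = pickLayout(window.innerWidth);
  });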

Standard Browser

The link appears as a normal content item:

No surprises here 🙂

Clicking the link causes a new window to open. The SCORM content plays in this, filling the screen. The content of the window underneath is repainted to show that the SCORM player has launched in a new window.

laptop launch
It is also possible to display a title bar across the top and the table of contents down the LHS if you want – they are not appropriate for this content.

When you have worked through the materials and close the SCORM player (and so its window), there is a short delay, then the remaining window is updated, returning you to the content items again.

I then tried resizing the popup window to see how the content layout responded.

When I started narrowing the window width, the content did resize to a layout more like the Tablet option set in Captivate:

The Start button doesn't look quite right
Some of the text has not flowed particularly elegantly, but it is all visible

When I continued to reduce the width, it eventually flipped to the Mobile layout:

Perhaps surprisingly, this form factor appears better

As such, albeit with a few layout issues, the SCORM content acts in a responsive manner when viewed on a desktop/laptop.

Mobile Browser

This test was carried out using an iPad, accessing Blackboard using the built-in Safari browser:

Very similar to the laptop experience at present

Things didn’t quite go as expected when I clicked the link to open a SCORM item. Instead of a new window I got this:

Safari didn’t like the popup window

If I then clicked the Launch Course button, the player launched.
Unlike the desktop browser, though, the content is preceded by this standard Captivate screen designed to provide help for mobile users:

The icon at the top right opens guidance for using gestures

If you click on the Gesture icon, these instructions are displayed:

Handy as they are not all immediately obvious

Click the screen again to dismiss this information, then click on the player icon to finally launch the content:

Launched in landscape

If you rotate the tablet, the content reflows to (at least partially) support the portrait orientation:

After rotating the tablet (note the larger Start button, because we are now below the 1024 pixel width setting)

When you have finished working through the content and close the SCORM object, you are returned to the course as before.

The popup blocker annoyed me, so I went to the Settings for the iPad and searched for the options for Safari. Sure enough there was a Block Pop-ups option:

There it is – Block Pop-ups. I switched this option OFF, as shown here.

With this duly set to OFF, I tried launching the content again. I expected the file to open but no, there was a further surprise. Now I was presented with a dialog box asking me to confirm the popup.

A pleasant surprise

After clicking Allow the content launched as before, showing the same special Captivate mobile screen:

The icon at the top right opens guidance for using gestures

When the content is displayed, it is laid out respecting the orientation of the device:

Landscape

and if turned:

Portrait – it has adapted to the narrower width, but not extended vertically.

Thus, with a bit of persistence, it is possible to play SCORM content using a browser on a tablet (or phone if you have a very high resolution screen and good eyesight).

Mobile App

The final tests used Blackboard’s Mobile Learn app.
The content is accessed by navigating a series of nested menus:

Ah the joy of magenta

When you click on one of the links to the SCORM packages, the app does not know how to handle the content, so you are palmed off to the built-in browser:

There’s often quite a wait at this point…

but eventually you get this:

Pop goes that weasel
A familiar page – note that this browser has ignored my Safari settings to allow Pop-Ups

If I click the Launch Content button the result is again, not what I expected:

It seems the built-in browser essentially closes the content on load, and so your SCORM session terminates abruptly.

If you repeat this cycle and click fast enough, you can end up at this page:

Would they really want an email from me?

I also tried updating the launch settings in Blackboard to do everything I could to avoid the popup:

No joy

Result: still no luck with the Mobile App.

As such I have to flag SCORM content as incompatible with Blackboard’s Mobile Learn app at present.

Testing Times for the Under Tens

When my son came home from school with a list of one hundred words to learn (thanks Michael Gove) I wondered if some technology could help. Somewhat hesitantly I started searching Apple’s App Store hoping to find something that wasn’t tied to a US-English dictionary. My search turned up a range of apps; the one I settled on was Super Speller, by a husband-and-wife software team, Quiet Spark. One of the reasons for this was their sensible approach to privacy and an absence of ads (well worth paying £1.99/US $1.99 for).

Creating a test is child’s play – literally in some cases!

Don’t let the clean interface of this app fool you into thinking it is too basic. It is deceptively powerful. Essentially you create a series of tests by typing in  words, then use the iPad or iPhone’s microphone to record yourself saying them.  That means your children will hear the words spoken in the local accent. So far so good…

One Week’s Words

The list supplied by (to?) the school doesn’t just contain words that are tricky to spell (like achieve or rhythm); it also contains words that sound alike. The question that initially troubled me was how you can use an auditory cue to help the listener differentiate between the potential responses. That’s where (with a bit of lateral thinking) this app excels. Rather than just saying the word and stopping, you can follow it with an explanation, e.g. recording the phrase “aloud – as in speaking out loud” or “allowed – as in permission to do something”. This way the meaning of the word as well as its spelling can be reinforced each time the test is taken. Equally, you could include it in a sentence and say something like “Spell allowed, as in ‘you are not allowed to pick your nose'”.

Once a test has been set up, there are a range of delivery options. Most are what you expect – the ability to shuffle the order, ignore capitalisation and, if you really feel the need, to set a time limit. Something that isn’t part of the enterprise testing solutions I am used to (think QuestionMark, Blackboard, Moodle, etc.) but perhaps should be is Super Speller’s Smiley hints option. Essentially this feature adds a Smiley at the top of the screen that gives the user regular clues as to whether their spelling of the word is on track. This is particularly useful when learning a new list of words. Whilst helpful, achieving full marks in a test using this feature means you miss out on the reward offered under “full test conditions” – a screen full of balloons to pop.

Check your spelling – the Smiley hint isn’t smiling any more!

The app also offers a Study the Test mode, where a link is added exposing the iPad’s dictionary. Just remember to set up the appropriate language for your iPad and enable/disable the dictionaries beforehand! The app will honour these settings – an important feature, as it should be your teacher, not the device, that has the last word in how a word is spelled.

A definition is offered using your default dictionary

Unlike some apps designed for mobile devices, this one supports multiple students, making it great for families who have chosen not to issue everyone with their own device. Before you take a test, you are prompted to enter your name, and the results are saved against your name.

Often it can help to add a few words not on the test list. This doesn’t need to be an attempt to trip them up – inserting the name of a favourite toy or TV character can add a bit of light relief and remind them that learning should be fun!

The app has a lock option you can use to prevent access to the Manage Test (a.k.a. the See the Answers) page. Whilst locking it down might initially seem appealing to parents, if you leave the app unlocked, then children can have fun making up their own tests, challenging each other (and who knows, even their parents!) Creating extra tests has proved much more of a draw to my children than the built-in word search and scrambling tools (though your results may vary!) It also provides some insight into the breadth of their current vocabulary and a chance to pick up any misunderstandings or mispronunciations early on.

Test Report – showing results of each attempt. This screen shot has been taken with the app in its alternative whiteboard skin, in case you are not a fan of the blackboard look.

It provides good reporting tools if you want to check on your children’s progress – you can step through the responses in each attempt. I’ve yet to explore the tools for sharing tests with others via email, but I can see the advantage, particularly if I was a teacher wanting to use this for practice in my class.

This app was written by parents to help their own child and I think this focus on making it appealing to children is the key to its success.  Only time will tell whether the balloon popping will retain its appeal with my children, but Super Speller has already proven to be a good way of getting them to complete their literacy homework. If I could change one thing, I’d like to add the ability to record an introductory or congratulatory video clip for a test, to make it feel even more personal.

CC BY-SA Image: No Technology in Brighton – taken by Sammy0716 and shared on flickr using a CC BY-SA 2.0 license.

What was (and wasn’t) said in Vegas

BbWorld was different this year, the second with Jay Bhatt at the helm. This was most obvious during the main keynote/road map presentation. Gone were the rapid pace, the regular clapping and the “cult of Chasen”; Bhatt was noticeably lower key, and some of the audience were left waiting for a “killer” new announcement that never came. Indeed, some people who should have known better tweeted that the keynote simply announced a repackaging of nothing new. Bhatt undersold himself.

The keynote hid several major changes amongst a wide-ranging discussion about modern society, its ills and why the current education system won’t fix them. Needless to say, this was delivered through the spectacles of technological determinism and left me and many other members of the audience feeling a little edgy.

Moving on to the hidden changes, the first was a rebuilding of Blackboard’s wide range of products into a series of bundles. These will differ slightly for the K12, Higher Education and possibly the professional education markets. From my hurried notes during the conference these are:

  • Learning Core – essentially Learn plus the Community System, Content System (including xpLor and portfolios) the social/cloud tools and some form of mobile access.
  • Learning Essentials – adds Collaborate and possibly some more features from Outcomes
  • Learning Insight – adds Analytics for Learn and also Outcomes for Assessment
  • Learning Insight and Student Retention – not the most imaginative name, nor the clearest definition, this seems to add some extra student retention features to complete the bundle.

More details are available on Blackboard’s website.

Why is this important? Well, Blackboard has just raised the base offering: essentially, when this is rolled out, everyone will have at least the traditional academic suite (the ‘deathly hallows’ of Learn, Community and Content). This should make it easier for the company to support the product (effectively by giving the product catalogue a long overdue haircut), but also much easier for users – the help documentation will finally apply to all users and we can get rid of questions such as ‘do I need the community/portal part for this to work?‘ It should also make user experiences more shareable and transferable. Anything that removes divisions between the user community is a good thing.

Secondly, a new user interface was demonstrated. This was a working prototype, accessing a standard Learn database (if such a thing truly exists!) but using node.js to render much of the page content client-side. This makes the interface appear much more responsive and allows Blackboard to match the end-user rendering speed of other solutions such as Canvas. By shifting much of the processing work onto the client side, it also helps the core Blackboard product to become more scalable. The use of client-side, just-in-time rendering also offers the possibility of much better reporting/learning analytics. A problem with building web pages in server memory and then sending them out to the end user is that you never know whether they saw the bottom of the page (or even the middle of a long page). If it is rendered on demand – e.g. in response to the user scrolling down – then we can record the fact that the information was at least actually displayed on screen to a person! In conversations I had with Stephanie Weeks she confirmed that this fact had not been lost on Blackboard either.
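
To illustrate why client-side rendering matters for analytics, here is a minimal sketch of the idea – my own, not anything Blackboard demonstrated – recording that a block of content was actually scrolled into view rather than merely included in the page the server generated. The endpoint URL and the data-track-view attribute are invented for the example.

  const seen = new Set<string>();

  function recordVisibleBlocks(): void {
    document.querySelectorAll<HTMLElement>("[data-track-view]").forEach((el) => {
      const rect = el.getBoundingClientRect();
      const visible = rect.top < window.innerHeight && rect.bottom > 0;
      if (visible && !seen.has(el.id)) {
        seen.add(el.id);
        // In a real system this would go to an analytics service;
        // the URL here is purely illustrative.
        navigator.sendBeacon("/analytics/viewed", JSON.stringify({ id: el.id }));
      }
    });
  }

  window.addEventListener("scroll", recordVisibleBlocks);
  recordVisibleBlocks(); // capture whatever is visible on first render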

This is one of several signs that Blackboard may finally be able to lever their market dominance and vast range of products for the better. Hidden in the programme was a ‘State of the Union’ address by SVP Gary Laing. He began by sharing rather too much of his life story and desire to work with Blackboard to reimagine (re-engineer?) education, but then thankfully he talked about key changes that are occurring behind the scenes. Coming in to the company with fresh eyes he has seen the results of Blackboard’s aggressive takeover and merger approach: multiple product lines, often with a degree of overlap, running on different hardware, often based in different parts of the world, written in different languages by individuals in different teams, often with their own definitions of what should be common terms defined (and hopefully stored) only once – users, departments/schools, institutional roles, term dates, etc. Laing showed us how these teams and products should be rearranged so that features like analytics and mobile feel built in rather than bolted on. He challenged us to think about SMAC – social, mobile, analytics and cloud (note this could be re-arranged as SCAM). These are ideas he wants us all to bring back to our home institutions.

Then a third Blackboard hosting option was offered – as well as self-hosted and Blackboard’s current managed hosting, there is to be a multi-tenant option currently referred to as a public cloud solution. This looks like an attempt to play catch-up and stem the loss of clients with limited budgets to cheaper cloud-only solutions (particularly Canvas). It is unclear how Building Blocks will fit into this model and how much freedom individual clients will have to select or write their own.

Indeed there is much still to work out. What will the new pricing structure look like? How will Building Blocks be able to exploit the new Ajax user interface? How many clouds can Blackboard manage? There were also some noticeable omissions – both the Community and Content systems were effectively ignored during the conference. Have they a place on the roadmap?

I think there are many reasons for hope from BbWorld14 and much for both Blackboard and the staff and student users to learn. It was great to see so many students present at the sessions and, as ever, their choice of external keynote speakers was excellent. As for number 15, to be held in Washington DC: if they can reintroduce the client voice into the programme selection, allowing it to become more critical (in a constructive way), then I am cautiously optimistic for the future. At least they belatedly realised that dropping the developer conference was a mistake 🙂

I’d like to end this review with a challenge for Jay Bhatt and his colleagues at Blackboard. If he really wants to reimagine education and believes that the way to do this is through data-based decisions, then is he willing to move Analytics for Learn into the entry level Learning Core bundle? Giving every Blackboard  user across the world access to powerful, integrated learning analytical tools would be a very strong message. Creating a common platform and millions of users would give the field of learning analytics a real boost, by allowing staff and students to easily exchange interesting questions and patterns. That might just get the slogan off the t-shirts and into our daily teaching and learning…


When is a cat not just a cat?

When it is being used as an example of digital literacies.

Digital Literacy is a term that is increasingly being bandied about the web. Whilst not (yet?) as misused as digital natives, it is already beating that term three times over in the Google matching game:

Over 17 million matches on Google, cf. only 5,620,000 for digital natives

So what does it mean? On the 27th of June, Doug Belshaw will be launching a book which should help you approach/hone your definition, drawing on his Ed D thesis. The abstract of that is refreshingly short:

Digital literacy has been an increasingly-debated and discussed topic since the publication of Paul Gilster’s seminal Digital Literacy in 1997. It is, however, a complex term predicated on previous work in new literacies such as information literacy and computer literacy. To make sense of this complexity and uncertainty I come up with a ‘continuum of ambiguity’ and employ a Pragmatic methodology. This thesis makes three main contributions to the research area. First, I argue that considering a plurality of digital literacies helps avoid some of the problems of endlessly-redefining ‘digital literacy’. Second, I abstract eight essential elements of digital literacies from the research literature which can lead to positive action. Finally, I argue that co-constructing a definition of digital literacies (using the eight essential elements as a guide) is at least as important as the outcome.

Belshaw, D. (2011) What is digital literacy? A Pragmatic investigation. Ed.D. thesis, shared under a CC0 (public domain) license.

Doug’s book builds on this, but is more than just a distillation of these ideas. It has been several years in the making and benefits from the ongoing reflection this has allowed. Its genesis reflects his commitment to open scholarship and shared scholarship.  Draft chapters of the book have been available in advance of the final release, with comments encouraged. Doug used a tapering cost model – the earlier you got involved the lower the purchase cost. The final edition is DRM-free and sharing is permitted: he includes the line ‘You’re welcome to share it with your friends, but please do encourage them to purchase a copy if they find it useful.’
We all have to eat…

The book comprises nine chapters. The first is an introduction, which explains how the remainder of the book is structured and suggests paths through it. Chapter 2 attempts to define the problem this book addresses. Doug explores the ideas of ‘digital’ and ‘literacy’ (in reverse order) and the reader learns to replace the concept of literacy with literacies. Chapter 3 stresses the ambiguous nature of such ideas, arguing that this ambiguity should be actively embraced rather than avoided. Models of digital literacies are critiqued in chapter 4, with an alternative – Doug’s eight Essential Elements of Digital Literacies – offered in chapter 5. The rest of the book tries to apply this framework. Chapter 6 uses memes as a way to understand digital texts. The next chapter looks at remixes (with a brief nod to copyright) and chapter 8 (perhaps unsurprisingly, given Doug’s current work at Mozilla) looks at coding the web. The final chapter provides a conclusion and encourages the reader to rip and remix the book.

Doug manages to draw a lot of ideas together in his book – we travel from the invention of the printing press to the World of Warcraft. He blends ideas from academic disciplines – education, linguistics, history, computing, philosophy – with everyday life – gaming, cooking, even furniture. The result could be a terrible pastiche, but it is not. Doug avoids this by weaving the thread of digital literacies through the book, thus demonstrating the value this lens can provide. Some chapters are flagged as ‘skip this if you like’, but I think they are accessible enough and worthy of reading. Some sections (such as the challenge to the requirement for linear progress in education) leave you wanting more (note this is not necessarily a criticism).

A real strength of the book is the well-chosen examples used throughout – no technical knowledge is assumed. To illustrate the potential confusions around copyright, Doug uses the concept of recipes (yes, as in cooking), and he derives an enormous amount of meaning from an ugly-looking baby* and cheezy cats when analysing internet memes.

*Apologies to his mother, flickr user Laney G. Checking her photostream shows said picture to be an uncharacteristic shot. He’s better looking than me; I think I should stop there…

In summary, The Essential Elements of Digital Literacies is a short, informative book written in a clear, often amusing style. If I were being really picky, it could probably benefit from one less font, but that is a minor criticism and does not detract from the thoughtfulness of the debate. I think it is one of those books that cannot be read widely enough and I recommend it to anyone. Reading it will not instantly make you digitally literate, but it will give you an understanding of why this is important and offer a framework to help you reflect on your own practice and that of others.