I’m not compulsively tidy, but I do like a place to put things and things in their place. That has been challenged recently, in a very playful way…
Anyone who has come across a Lego Serious Play® practitioner (the need for a registered trademark symbol speaks volumes) will have a yellow duck. Nearly every introduction to the method starts with a little bag like this.
True adherents (such as Ollie Bray from Lego) attribute a lot to this humble construction act.
Having taken part in a range of Lego Serious Play® sessions, I have a small badling of ducks. Until recently they stood quite happily on the top of my bookshelf. That was all to change.
I blame the 2019 Playful Learning Conference. Nothing has ever been quite the same since. I think that fact would really please the organisers. As it should.
It started with someone stealing my ducks and relocating them all round our office – by the kettle, on the stairs, on a window sill. For some reason I took this as a challenge and spent a good 5 or 10 minutes each morning returning my ducks to the safety of the shelf. No-one in the office said anything, or even acknowledged this act.
It was time for action. I couldn’t start every day with a duck hunt! A few ideas came to mind. Make a giant yellow duck – rejected because I didn’t have enough yellow bricks. Use Lego to spell out a message? Again rejected – not enough space on the bookcase for many words. Finally I settled on attaching the ducks to a baseboard and using some of my family Lego to add a small guardian in front to protect them. Job done, I thought.
The next day I came in and sure enough the ducks were still stuck to the board, safely on my bookcase. A few minutes later when I looked out the window I saw another Lego figure. Ignoring the fact that it appeared to be standing on a disassembled duck, I read its banner: Free the Ducks.
Fearing that the white-tack holding it to the window may not be able to resist the unseasonably strong sunshine for long, I opened the window to retrieve it (and look for clues).
Unfortunately, as I opened the window, the duck under the figure crumbled and the whole assembly fell to the ground below. Rushing down the stairs (accompanied by my office-mate Julie), I found a tragic scene. The duck was in pieces and the protester was lying face down, not moving or responding to my voice. I could now see the message on the back of their banner: Don’t hate – migrate.
Having no immediate plans to leave the country, I decided not to dwell too much on the latter. I carefully picked up the Lego pieces, carried them inside and back upstairs, before laying them out respectfully on my desk.
One senior management meeting later and when I got back to my desk, the ducks were gone again. My ducks. Gone.
Looking up, the answer dawned. Someone – clearly not an animal lover – had stuck them by the feet to the ceiling. How long before the blood drained from their feet and they fell? This was getting crazier and crazier. That said, I was enjoying coming in to work each day and wondering what would happen next. I was also getting knowing smiles from people as I walked around the building.
Then I saw them on Twitter. It was interesting to be painted as the villain, with people feeling the need to free them. Clearly they didn’t understand the difference between kidnapping and rescue, home and away.
I expect this is not the last chapter in this story, especially as we move to a new office next week. It has caused me to pause and reflect on the impact of play upon my attitude to work.
What started out as a very silly, probably impulsive gesture has snowballed. The list of perpetrators – collaborators? liberators? players? – continues to grow.
It has made work more fun. How often do you get to write that? It hasn’t stopped me from doing my normal job, but it has added a little spice, an edge of unpredictability. It has given me something to think about on the drive in and talk about at the dinner table.
Assuming at least one of the people involved is from my team – and I am pretty sure they are – it has also made me think about how we interact at work. Should I be annoyed or pleased that someone I manage thinks it appropriate to hide my possessions?
I am pleased (although if they started doing it with my laptop or coffee mug I might change my opinion). I’m also looking forward to going into work on Monday and seeing what happens next…
For attendees at BbWorld this year, the question on everyone’s lips was “have Blackboard finally released a version of Ultra that we can use?” Most left with the answer “no” but seemed relatively satisfied. Things are complicated.
The term “Ultra” has become overused, a muddle that is entirely of Blackboard’s own making. (For more details see this attempt by Blackboard’s Lynn Zingraff to shed light on the terminology and the company’s progress delivering it).
From my perspective, in its purest form “Ultra” applies to a revamped version of Blackboard Learn running in the SaaS environment on a PostgreSQL back-end, featuring a new responsive interface that adopts Blackboard’s New School design language (more on that in a minute). This new product is being trialled by a few schools, but for most existing customers it still lacks many of the key features they need (such as comprehensive testing, group functionality, SCORM support and a choice of language). As such, few deem it ready for use in the production environment.
Where things get confusing is that, perhaps in an attempt to appease those waiting, Blackboard talk about the “Ultra experience” when referring to existing components that adopt the same New School design. This includes the current Bb Student mobile app, plus some new ones coming soon (Bb Instructor, Bb Predict and Bb Planner amongst them). New School is also at the heart of the redesigned and much-improved Collaborate tool (see below).
Literally a couple of days before BbWorld, Blackboard released an initial responsive theme for the existing Blackboard Learn 9.1 product – allowing them to claim that they were now bringing Ultra to 9.1 clients – albeit only those customers running the latest version (April/Q2 2016).
I’m going to provide a full review in my next blog post. From my initial testing, it is clearly not the real Ultra experience – being responsive alone does not make this New School. There are some usability issues (colleagues from Keele thought that users might have difficulty restoring a course menu once minimised) but I remain generally positive. Overall, it is definitely a better experience for people accessing the traditional Learn product using a phone or tablet’s native browser.
It is worth noting that this theme is best considered a work in progress – Blackboard have already indicated that they plan further improvements in this area for the Q4/October 2016 release. This is something that clients have been demanding for a long time – to the point that one – Bert Coenen from KU Leuven – had gone as far as building a proof of concept responsive theme in 2013 (see the OSCELOT projects site).
One Learn is Enough for All of Us
In the keynote and subsequent road map presentations, Blackboard staff talked a lot about “1,2,3” – that’s one Blackboard Learn, two experiences (original and Ultra) and three deployment options (self-hosted, managed-hosted and SaaS). I’m not sure that we have got to “one Learn” yet – Blackboard still seems to be supporting a ridiculously broad portfolio of solutions, versions and environments. They were very clear that they were not announcing the death of 9.1 – they remain committed to the product and its development.
Given this failure to really deliver “Ultra for all”, you could be forgiven for assuming that the senior management were berated by users. That wasn’t the case.

The reason for this, I think, is the new CEO – Bill Ballhaus. He has a very different presence to his predecessors – none of the swagger of Michael Chasen, none of the “trust me I am/was a teacher” of Jay Bhatt. There may have been less passion in the BbWorld keynote this year, but there was a greater air of professionalism. People are willing him and the company to deliver.

Bill stressed his background in engineering – a focus on quality and planning was essential to get something into space. He wanted to bring that same approach and focus to Blackboard. He understood what Blackboard was about – producing software that would help students succeed. It made a refreshing change from the arrogance of previous mission statements: “to continually reimagine education for a new generation of learners and teachers”. It’s not on the same scale as Shuttle failures, but the recent hosting problems experienced by UC Davis with their Sakai system reminded many just how business-critical these systems are to higher education.
Another common feature of Blackboard’s presentations was the use of little vignettes, seeing a problem from the perspective of one or more people. This was often done with a sense of fun – learning about ‘Jack and Diane’ (any John Mellencamp fans out there?) or ‘Bert and Ernie’ (a tribute to Sesame Street) during the Mobile Roadmap.
Why? Well it is evidence of Blackboard’s increased use of the design thinking paradigm. This video (shared by Eric Kunnen) explains IBM’s take on design thinking – made even more relevant given the strong IBM presence at the conference following Blackboard’s (US) web-hosting partnership with both Big Blue and AWS.
In short, IBM see the design thinking process as understanding people’s needs, exploring solutions to meet those needs, prototyping solutions and evaluating the results. As such it shares many traits with agile development. The key is to spend time talking to people to understand their needs, identifying the key “pain points” in existing solutions.
The design thinking output from all Blackboard’s empathy mapping and creativity is a series of persona-based applications. ‘Jack’ the time-strapped student may use Bb Student, whilst on-the-go lecturer ‘Diane’ will use Bb Instructor to run webinars even when away from her desk.
Whilst admirable at many levels, such an approach risks technological determinism, or at least a belief that things must be changing, ergo we need a technological solution. I wonder just how universally applicable this approach is: I wouldn’t just talk to the inmates if I was going to design a prison! Less glibly, can this approach differentiate between students’ needs and wants? Will it help deliver a better education, or just add to the existing consumerism?
At the conference, one “key insight” from Blackboard’s conversations with students was shared: “Students have a short-term, reactive approach to deadlines & due dates.” This attracted derision from some delegates (“tell me something I don’t know”) and worried me.
The design solution offered – the activity stream – comes from Blackboard’s “fluid, intuitive, fun-loving”* New School design language. The question is, by placing this stream first and foremost when you open the Bb Student app or a course on Ultra (SaaS) are we not engineering dependency, reinforcing an ineffective approach, rather than trying to solve the underlying issue – be it that staff need to communicate deadlines more clearly, or that students need to develop better time management skills?
Whose app is it anyway?
When describing New School, Blackboard staff stress the value of the aesthetic. They see their task as “To help shift the overall gestalt of educational software, from enterprise features to emotionally supportive solutions.” This quote shows what difference they think this will make:
“The traditional way of building software focused on requirements, driven by a competitive view of the market, and responded to perceived needs with additional features and functions. Our new way of developing products focuses on people first: on spending lots of time with them in order to build an almost intuitive sense for their emotional journey. We design with that journey and those emotions in mind, and as a result, we can produce great products that people love.”
I am pleased to see that so much thought and planning has gone into the appearance and user experience of the new tools. I think it is paying off, as we can see the emergence of a consistent “Blackboard” user experience. One question I have: who are students and staff thinking of when they interact with Bb Student or Blackboard Learn? The software vendor, or the institution that chose it? Who should it reflect?
I also sign up (at least in part) to the “less is more” mantra. The current version of Learn contains lots of clutter and developmental dead-ends (e.g. the un-loved Tasks feature) and I would happily see these removed. There is a limit to this action though – minimalists take note, you still need a functional product that meets your basic needs.

An example of a gap only recently filled is the ability to access discussion boards via the Bb Student app – only added in June 2016. It had always puzzled me why it had taken so long to implement this common feature. It turns out that Blackboard were wrestling with how to display nested discussions with comments on a mobile device (a struggle that somehow did not hold back Mobile Learn, which has supported them for ages). Admittedly, displaying multiple nested lists quickly eats into the space available to display the message. The answer is to display only one level of nesting and add an icon for comments. Sounds sensible, but why did it take so long?
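For the curious, the “one level of nesting plus a comment count” approach can be sketched in a few lines of Python. The thread structure here is hypothetical – just enough to illustrate the rendering rule, not Blackboard’s actual data model:

```python
def flatten_thread(post: dict) -> list:
    """Render a nested discussion one level deep: each direct reply is
    shown in full, and everything nested beneath it is collapsed into a
    comment count (displayed in the app as an icon)."""
    def descendants(p: dict) -> int:
        # Total number of posts nested anywhere below p.
        return sum(1 + descendants(r) for r in p.get("replies", []))

    return [{"text": reply["text"], "comments": descendants(reply)}
            for reply in post.get("replies", [])]
```

A reply with a dozen sub-comments still occupies a single row of screen space, which is presumably the compromise Blackboard settled on.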
Down the line
There were a number of other new or improved products discussed at BbWorld. Analytics for Learn (henceforth to be known as Intelligence for Learn) showed some promise – John Whitmer spoke animatedly about its potential. I was probably most impressed with the changes planned for Blackboard Collaborate. Here at least we are seeing Blackboard thinking like a company that understands the needs of educators. On the very near roadmap are indicators of the progress of file uploads. So after uploading your presentation into Collaborate, you can see if everyone has at least got slide 1 displayed on their screen before you start talking. There are thoughtful inclusivity features – the ability to live caption and provide a locked video feed showing people signing. Linking back to analytics, there are also discussions about ways to measure the effect of webinars on students – recording user metrics and possibly a real-time engagement score. If they can get this right then they will establish a clear lead over generic messaging apps such as Skype and Google Hangouts.
Bb Instructor

This is the next persona-based app, filling in the current gap in offerings for staff (Bb Student currently only shows courses where you are enrolled as a student), though I was promised that this may change and that there may even finally be a presence on Bb Student for a course’s shameful sibling – the organisation.
Bb Instructor is aimed at the tablet format and combines the existing marking functions released in Bb Grader with a new mobile interface for Collaborate that exposes Moderator functions. Later releases should add more course planning and class management features, finally exposing some Edit functionality!
Bb Planner

This product, according to the marketing speak displayed at the conference, is all about “empowering students to take control of their educational journey”. Blackboard seem to have taken what they learned from users of MyEdu and their own Bb Labs Job Genie product and combined it in this tool. Bb Planner prompts students to identify their goals and interests, then based on these it suggests relevant programmes of study and careers. In the demo I saw, students were shown the likely salary and current demand for their chosen career, the training requirements and the relevance of the courses that they had completed to date. They were able to select suitable courses at the institution and enrol upon them! The app uses data from external sources – jobs from Burning Glass and videos where people reflect on their careers from Roadtrip Nation.
Blackboard are not planning to release this tool in the UK as yet. In truth it will need more work before it can be widely used across the US. Development to date has been with one institution – Northeast Wisconsin Technical College – leaving many pundits asking how transferable/representative their processes are to others. I also wonder what the user experience will be when the app tries connecting to the student record and timetabling systems of large institutions – I think the response times might be greater than we saw in the demo! It is an admirable aim to avoid the confusion and queuing associated with such choices, but I am not sure where this sits with regard to institutional priorities. If course/programme selection is only a hot issue for a couple of days at the start of each academic year, how much time and money will institutions be willing to invest to fix it?
Blackboard Advise

The first thing to note is that this product is not called “Bb Advise”. I think that’s because it is not a mobile app, so it is given the full company name. This tool is aimed at staff and designed to help them understand students’ goals, interests and aspirations – helping to inform face-to-face contact. Blackboard also see it playing a role supporting academic and career planning and also on the pastoral side, with dashboards designed to help staff spot and contact at-risk students. We weren’t given a demo of this – just a few screenshots.
Blackboard Predict

This tool is based on the predictive technology developed by Mike Sharkey and colleagues at his startup Blue Canary, which Blackboard acquired. From the name you can surmise that it is not designed for use on a small mobile device. It comes in three parts: first, a predictive model that uses an institution’s existing data to define measures that can identify students at risk; second, a series of dashboards that display these data, targeted at students, lecturers and advisors; and finally, a communication tool that allows teaching staff to pass on potential alerts to colleagues in counselling and other professional support services, so that problems can be dealt with early on. I don’t think this is targeted for release outside the US in the immediate future.
Supporting student success
Blackboard appear keen to position themselves beside the student throughout their learning journey – selecting courses, managing their finances, choosing careers as well as the traditional learning and teaching activity. This raises more questions than answers: Will they be welcomed? Will institutions want to have so many of their core processes depending on a single supplier? How will other players in this arena react? How will this attempt at standardisation fit with the various existing frameworks and bureaucratic processes?
As a company their products seem to sit better together, offering a more joined-up experience. I think that’s why so many Blackboard staff look happier this year than they have for the previous few. I think that’s a very good sign, as they employ some very clever people. Hopefully we have seen the end of “lab” projects that rely on Facebook authorisation when institutions already have their own perfectly good, established authentication and authorisation solutions.
I think Blackboard can look back on BbWorld16 as a success, but should note that the pressure to deliver a true Ultra experience for Learn has only intensified. In July 2017 the audience in New Orleans will be expecting more than just a responsive-ish theme to take back home. Will Bill be able to deliver again?
This post’s title is a tribute to the highest point on a road in Scotland, where weary travellers would pause for breath (see this site). Today I am reflecting on the long journey that Blackboard has taken in its use of APIs and interface hooks and what the release of the new REST API means for developers.
One of the strengths of Blackboard’s Learn product has been the fact that developers could extend it using Blackboard’s Java APIs. This has led to a host of third party tools (including many free, open source solutions) which add new functionality or address gaps in the interface (see the building blocks catalogue and the OSCELOT projects site).

This has not been the easiest space to work in – despite repeated requests, many of the APIs remain undocumented or flagged as private, even though there is no other way to achieve the outcomes. Worse, the underlying data model suffers from an early decision to separate the functions exposed in the user interface from the more administrative features (such as managing users, courses and enrolment records via a feed from a student record system). This means that there are actually two classes representing most actors – e.g. blackboard.data.user.User and blackboard.admin.data.user.Person. The associated loaders use different parameters, making the whole process more complicated/tedious than it needs to be.

Extensions using these APIs and the associated tag libraries result in a tightly integrated solution that looks and feels just like the rest of Blackboard. That’s good, but as the code runs on the Blackboard server, there is a risk attached. As we see Blackboard’s preferred architecture moving from a self-hosted model to the cloud (in some cases adopting a multi-tenancy approach), the ability to alter the core code becomes less attractive to the vendor.
Blackboard’s second foray into this area was to expose a series of SOAP web services. Whilst used by some clients, they are clunky and not popular with developers. They too suffer from incomplete documentation. Indeed, they are so unloved that one developer (Andrew Martin from Newcastle) actually wrote alternative SOAP and REST frameworks, earning a Blackboard Developer’s Award in 2008.
Fast forward to 2016 and Blackboard finally unveiled their long-awaited REST APIs. Or at least the first few (see https://developer.blackboard.com/portal/displayApi). Initial impressions are good. They are using Swagger to document them, making it much easier to get started. They have also given a lot of thought to the end points, with GET (read), POST (create), PATCH (update) and DELETE (delete) methods used consistently. As a slight aside, Blackboard are working on a similar set of REST APIs for their Collaborate product. That’s great, but it looks like they were developed without talking to the Learn folks, as they use slightly different method names (e.g. PUT rather than PATCH). Hopefully that discrepancy can be resolved before they are made public.
Calls use an OAuth token to authenticate. Each web application requires a unique key and secret, which the developer registers via Blackboard’s new Developer portal and a Blackboard sys admin then registers on their system. In this first implementation each app has just one key; I think it is likely that later iterations will let developers generate a unique key for each institution that chooses to deploy the app, allowing greater granularity of access control and reducing the risk of a man-in-the-middle attack.
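To give a flavour of the flow, here is a minimal Python sketch of the token dance using only the standard library. The host name and credentials are placeholders; the endpoint paths match Blackboard’s published examples at the time of writing, but treat this as a sketch rather than production code:

```python
import base64
import json
import urllib.request

def basic_auth_header(key: str, secret: str) -> str:
    """HTTP Basic header used when requesting the OAuth token (key:secret, base64-encoded)."""
    token = base64.b64encode(f"{key}:{secret}".encode()).decode()
    return f"Basic {token}"

def get_access_token(host: str, key: str, secret: str) -> str:
    """POST grant_type=client_credentials to the token endpoint; returns the bearer token."""
    req = urllib.request.Request(
        f"{host}/learn/api/public/v1/oauth2/token",
        data=b"grant_type=client_credentials",
        headers={"Authorization": basic_auth_header(key, secret),
                 "Content-Type": "application/x-www-form-urlencoded"},
        method="POST",
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)["access_token"]

def list_users(host: str, token: str) -> list:
    """GET the users collection using the bearer token from above."""
    req = urllib.request.Request(
        f"{host}/learn/api/public/v1/users",
        headers={"Authorization": f"Bearer {token}"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)["results"]

# Usage (against your own instance and registered app):
#   token = get_access_token("https://learn.example.edu", key, secret)
#   users = list_users("https://learn.example.edu", token)
```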
Blackboard’s Scott Hurrey and Mark O’Neil were both very keen for attendees to use the new API. To help, they have developed a series of examples on GitHub in a range of languages, including Python, Java, C# and Ruby. The REST API is now included in the latest Blackboard Learn release (2016 Q2 – albeit flagged as ‘technical preview’) and in their Vagrant-based Developer Virtual Machine.
What does this mean for developers? The REST examples released to date are all external tools – designed to pull data from Blackboard Learn, possibly manipulate it and then push it back. This seems to be part of a co-ordinated move to “encourage” third party code to stay off the core Blackboard servers. Whilst sensible enough for commercial integrations, this is not such good news for in-house solutions or people developing extensions designed to run completely within Blackboard.
The new Blackboard Ultra lacks many of the extension points/hooks available to developers in previous versions of Learn (e.g. it won’t let you define new course control panel tools). There was one session suggesting that you could craft special LTI links using template variables and custom parameters which point at a building block to get round this (LTI is still supported in Ultra) but this is at best a kludge. Even uploading code as a building block will require assistance from Blackboard.
In summary there is a lot to be thankful for if a set of REST APIs will meet your needs. These appear better thought out and better documented than ever before. If however your needs can only be met by code running Blackboard server-side, then things might get trickier in the future. It seems flexibility is being sacrificed for stability and security. Only time will tell how wise a move this is.
The “new” moniker so beloved of Labour in the Tony Blair era has made a reappearance, this time at BbWorld 2015 in Washington DC. It came this time from Jay Bhatt, the latest President and CEO of Blackboard. We are now given “New Blackboard”:
Not content with that, he’s also giving us NLE – the “New Learning Experience”. (Wisely avoiding the XP option).
So what’s it all about? Well Jay Bhatt has identified five key areas:
Intriguingly, and almost as an afterthought in his keynote, he stressed that integrated workflows might be the most important of these. Really? He talked passionately about the work that Blackboard have been doing, re-engineering their entire portfolio from the ground up (though that isn’t news – the same was said in Vegas, and in Vegas before that).
There was also a strong attempt to show that Blackboard is putting the learner centre stage, with enormous pictures of children dwarfing the presenter on stage:
So what was new then? How will all this investment in technology help improve education? Jay is calling for a revolution, claiming that education is in crisis. Re-imagine Education, everyone. So let’s see what he’s offering…
We were shown Blackboard Learn sporting the new Ultra interface. Again. Yes it is still pretty. Learn finally looks like it is using code that was written after 2010. But does it surface or bury the information learners need? I worry that it is offering content without context in the stream of new items and latest posts. Where has the scaffolding and advice from staff gone? If the underlying data model and database structure hasn’t changed, is it more than lipstick on a (very fat) pig? Time will tell. Oh and they will still host Moodle for you too.
Not exactly new. This HTML5 video experience has been showcased at the previous two BbWorlds and yet still hasn’t achieved feature parity with the old Java version – no recording yet, no integration with Blackboard (ironic given the later claims of integration being the key).
Again, how is this new? This was basically a repeat of what was said the previous year. In fact this presentation seemed to cause more confusion than light, with many commentators on Twitter wondering just when the much vaunted “Ultra” interface (that is to premier in the SaaS offering) will filter down to self-hosted clients, indeed if it will make it at all.
The app discussed last year is now available to some people. New?
You want it? Blackboard will provide it for you, at a keen price. Is this news?
Essentially joining things up – Blackboard have been threatening to do this for years. One day they might actually manage it, but anyone who’s looked at the admin interface of the current Collaborate product will know that there is still some way to go on this. Too many of the products lack parity – no analytics data from Blackboard Mobile being an obvious example. This is an admirable aim. As it was in Las Vegas in 2013. That doesn’t make it news.
This was probably his strongest card. Providing integrated analytical data for all staff and students could help us to understand more about the way we learn. Sadly this is likely to be an optional extra, rather than a core offering available to all Blackboard clients.
He looked at the learner’s experience, stressing the lifelong nature of learning. Most relevant to me was the example of a higher education student:
We were then shown the workflows:
and guess what, Blackboard has solutions for all of these:
News? I don’t think so.
There were a lot of promises and repackaging of things already delivered. Perhaps my initial political analogy wasn’t so far off the mark after all. Come on Blackboard! There are a lot of bright, motivated people in the company, many of whom have spent time in education. You can do better and will need to do better if you are to deliver products that staff and students want to use and institutions are prepared to buy.
This post documents my investigation into getting SCORM-packaged content to play in Blackboard, and into what we can learn about the learner’s interactions via the Grade Centre.
There is often a need for a simple way to insert learning materials created outside the LMS (Blackboard, Moodle, Sakai, etc.) into it. These could be materials produced by a third party – perhaps supporting a published text book, or a custom training course. It is at this point that people usually start talking about SCORM.
What is SCORM?
SCORM (the Sharable Content Object Reference Model), although not a standard (sensu IMS, etc.), is the nearest thing there is to one. Whilst newer formats such as Tin Can (also called the Experience API) offer more possibilities, at the time of writing very few LMS vendors have implemented these sufficiently to make them any more functional than the latest flavours of SCORM. The list of Tin Can adopters published by Rustici is impressive, but may perhaps be best seen at present as a list of those working to implement it fully (see for example these posts regarding Sakai, Blackboard and Moodle). Apologies in advance if there are more recent developments that I am not aware of!
The trouble with SCORM is that it is not a single thing. In fact it is a wrapper around a series of existing formats, and you can meet this reference model in several different ways. Most people using SCORM today use one of these “flavours”:
SCORM 1.2 – released in 2001 and still the most widely supported version.
SCORM 2004 – which is available in 4 editions, edition 4 (2009) being the latest.
Rustici have published a very useful set of SCORM resources. In this post I am going to test some custom content created in Adobe Captivate and export it using a range of SCORM flavours, importing the resulting file into Blackboard. I will then look at the reporting options available and how these measure up to those for a standard test/quiz.
Technical Information
Adobe Captivate running on a MacBook Pro with OS X Yosemite
Blackboard Learn 9.1 October 2014 release (CU2)
Building Block: Rustici SCORM Engine 2013.4.1164954*
* Blackboard released a new version of this Building Block as I was part way through these tests. I have learnt the hard way not to apply these too quickly.
I created a simple object (which I called Vanilla) which uses a range of different question types. Some of these are drawn from pools, some are hard-coded into the object. In any run through, the user will be asked to answer eight questions, each worth 10 points:
Multi-choice (single answer) question – 1 drawn at random from a pool of three questions
Fill in the blanks question
Matching question (three heads to three tails)
Image hotspot question
Ordering question (arrange the 7 steps of a process in the correct order)
Multi-choice (single answer) question – 1 drawn at random from a pool of three questions
As the previous question, with a different question drawn from the same pool.
Before exporting anything, you need to define how a person passes the test. This is defined in the project Preferences: File | Project Info, then click on the Pass or Fail option down the LHS:
You can see here that we have set the pass rate to be 80% (pretty high). There is also the option to choose a numerical score as the threshold. The other options (relating to what happens if you fail and the ability to retake the quiz) can be ignored for the purposes of this test. The next step – deciding what to report back – differs depending on the SCORM flavour you use…
A SCORM object can report (to the LMS, usually via the Grade Centre or equivalent) different measures, depending both on the SCORM flavour you use and the way you set it up. Typical fields it can report are:
The Completion status (how many of the pages of content you actually viewed). When a SCORM course is launched, the status changes to Incomplete. It changes to Complete when the user meets certain criteria – the Completion Criteria.
The Score of any quizzes taken
The score and/or the completion status can also be used to derive a Pass/Fail flag
Ideally, you also want it to provide details of the Interactions – how long people spent on each page, which questions they attempted (remember these may be drawn at random from a pool) and the answers they provided.
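To make those fields concrete, here is a minimal sketch (in JavaScript, the language SCORM content uses to talk to the LMS) of how they might be derived. The `cmi.*` names are from the SCORM 2004 run-time data model, but the wrapper function and its inputs are hypothetical – this is not Captivate's actual code:

```javascript
// Sketch of the values a SCO might report back to the LMS.
// Field names follow the SCORM 2004 run-time data model; the
// function itself is an illustrative stand-in, not Captivate's runtime.
function buildReport(slidesViewed, totalSlides, rawScore, maxScore, passMark) {
  const scaled = rawScore / maxScore; // cmi.score.scaled is 0..1 in SCORM 2004
  return {
    "cmi.completion_status": slidesViewed >= totalSlides ? "completed" : "incomplete",
    "cmi.success_status": scaled >= passMark ? "passed" : "failed",
    "cmi.score.raw": rawScore,
    "cmi.score.max": maxScore,
    "cmi.score.scaled": scaled,
  };
}

// A learner who viewed all 18 slides and scored 70/80 against an 80% pass mark:
const r = buildReport(18, 18, 70, 80, 0.8);
console.log(r["cmi.success_status"]); // "passed" (70/80 = 87.5%)
```

Note that SCORM 2004 keeps completion and success as separate flags, which is exactly the split the Captivate preferences expose below.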
In the simplest case, the Completion Criteria might be just that the user has launched the package. It is more commonly defined as either viewing a certain number/percentage of slides and/or achieving a pass score in the quiz. In order to export content as SCORM you need to adjust the project Preferences. File | Project Info and then click on the Reporting option down the LHS:
If you select SCORM 1.2 under the Standard select control the rest of the page looks like this:
Note that you only have the option to export the 3rd Edition flavour of SCORM 1.2 in Captivate. It also only supports a single SCO (Sharable Content Object) within a project (actually the entire project), even though SCORM itself allows several. This restricts the reporting options, although in most cases a single SCO is adequate. The default Advanced settings are used in these tests:
If we select SCORM 2004 under the Standard, the page changes slightly: the Status Representation options seen in SCORM 1.2 disappear and instead we have the option to define separate Completion and Success criteria. This allows for the same level of reporting as the earlier incomplete/fail/pass option: incomplete = neither criterion met; fail = Completion criteria met but Success criteria not met; pass = both criteria met. You can define both criteria using test scores, the number of slides viewed, or simply opening the package. In my case I have opted to define completion as viewing a certain number of slides and success as passing the test. Again there is the option to report the Interaction Data. If you click Configure, the dialogue displayed is similar to SCORM 1.2, but note that the user has the option to select their preferred version of SCORM 2004 – the 2nd, 3rd or 4th Edition.
I chose to use the latest one. We left the Advanced settings at defaults (as with SCORM 1.2). For testing I published three SCORM 2004 versions, one for each edition available in Adobe Captivate.
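For the record, the flavour of a published package is declared in its imsmanifest.xml, which is what the LMS parses on import. A heavily trimmed sketch (Captivate generates a much fuller manifest; the schemaversion values shown are the real ones for these two flavours):

```xml
<!-- Trimmed metadata block from an imsmanifest.xml -->
<metadata>
  <schema>ADL SCORM</schema>
  <!-- A SCORM 1.2 package declares: -->
  <schemaversion>1.2</schemaversion>
  <!-- A SCORM 2004 4th Edition package declares instead: -->
  <!-- <schemaversion>2004 4th Edition</schemaversion> -->
</metadata>
```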
Publishing the Content as a Zip File
Regardless of SCORM flavour, the process of publishing the content is the same: File | Publish…
Note that the option to Zip Files is ticked. In summary, at the end of the tests I had four SCORM packages:
SCORM 1.2 3rd Edition
SCORM 2004 2nd Edition
SCORM 2004 3rd Edition
SCORM 2004 4th Edition
The content was added to a new Blackboard course as a Content Package (SCORM) – this is the custom content type defined by Rustici’s building block:
You are prompted for the location of the zip file you published:
After selecting the file, I clicked Submit to upload it to Blackboard, where the manifest file is parsed. The first part of the form reads the descriptors from the manifest and asks you to confirm the details displayed to students:
Note that the Title and Description values are taken from the manifest, using the values you typed into the Configure box above. The second section determines how users can interact. After thanking the developer for selecting “No” as the default option for Make SCORM Available (you can never get enough radio button clicks), it’s down to the attempts. For these tests I am going to allow an unlimited number, without any date restrictions. There is no sense in turning tracking on, as you will get much better data from the SCORM object itself when you open the Grade Centre.
The third section is where you define the integration with the Grade Centre (unless you choose the option No Grading). Note that even if the content doesn’t have a quiz baked into it, you could set a simple score related to completion that would indicate the people who had worked through it. Blackboard users note, the Title setting here is what is used to name the Grade Centre column. Given the multiple flavours of SCORM that I am testing, I will have to be careful here and use one of these names each time:
Keeping them short is a vain attempt to reduce the scrolling in the Grade Centre. I’m selecting the SCORM Score as the Grade, so I will see how well people have done in the quiz. The Grade SCOS section (which should read Grade SCOs) can be left at No, as this Captivate content can only ever have one Sharable Content Object, so there is no need to provide separate scores for each part. I am leaving Grade Timing at showing the Grade of Last SCORM Attempt (though you can get them all via the Grade History).
Finally, clicking the Submit button creates an entry in Blackboard.
When you click the link, the chances are that, out of the box, it will not behave as you wish. In my case, it opens as a new popup window that is not quite the right size:
To sort this, you need to edit the content again in Blackboard and make use of a feature that only appears after initially submitting it and does not conform to the Blackboard UI style manual (oh yes, there is one!). Below the SCORM AVAILABILITY entry there is now a new entry, ADVANCED OPTIONS:
If you change the radio button to Yes, a whole heap of extra options are displayed:
The first set relate to the Navigation Controls – these can be important if you want to show/hide the course structure down the left-hand side, or display a Close button at the top of the screen:
Clicking any of the other text below the radio buttons, e.g. Launch Behaviour, gives you more options.
This is where you can set the size of the opening window. The page helpfully highlights any changes in a pretty shade of rose pink:
That’s it sorted now.
I then repeated the process for the three SCORM 2004 flavours.
Note that as far as configuration within Blackboard goes, in this case the same settings can be applied to all four flavours.
Note, later I explored these settings in more detail. I came across a variable that sets the SCORM edition under Compatibility Settings:
I was not impressed that this was not set correctly by default for the SCORM 2004 items – it seems to default to the 2nd Edition 😦
On the up side, changing this did not seem to make any difference to the content item’s behaviour.
The SCORM 1.2 item did not have this setting, so at least it detected that correctly!
Completing the Tests
I then logged in as a student and completed the four SCORM objects. From the user’s perspective, you could not tell the difference between them.
I will come back and look at the experience using a range of devices (laptop browser, mobile browser and mobile app).
Logging back in as an instructor, I can see the entries for each SCORM item in the Grade Centre:
To see the detail of any given attempt, you have to select it using the context menu at the right hand side of the entry:
To cut to the chase, there is a difference in the data displayed from SCORM 1.2 and SCORM 2004 items, but no difference between the various SCORM 2004 flavours:
The key advantage of the SCORM 2004 format is that you see the text of the answer selected by the user, rather than just an identifier (e.g. a, b, etc.).
The utility of the response varies by question type. For text responses it is easy to see which option the learner chose and whether it was correct, but for others – e.g. hotspots, where just the co-ordinates are displayed, or ordering questions, where you need to parse (URL-decode) the entries selected – you need to pay much closer attention.
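As a sketch of what that parsing involves – assuming, per the SCORM 2004 run-time data model, that a sequencing response is a “[,]”-delimited list, and that each entry arrives URL-encoded; the example strings below are made up:

```javascript
// Decode an ordering-question learner_response from SCORM interaction data.
// "[,]" is the list delimiter defined in the SCORM 2004 run-time data model;
// individual entries are assumed to be URL-encoded text.
function decodeOrderingResponse(raw) {
  return raw.split("[,]").map(decodeURIComponent);
}

// A hypothetical response to the 7-step ordering question:
decodeOrderingResponse("Pick%20the%20pods[,]Cure%20the%20beans[,]Dry%20in%20the%20sun");
// → ["Pick the pods", "Cure the beans", "Dry in the sun"]
```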
It is interesting to see how long a student spends on each question.
That’s about as good as it gets though. There is no overview available across all the students on the course, no ability to analyse the effectiveness (discriminatory power) of the individual questions.
Compare the results above with the sort of analysis that comes out of the box for the much-derided Blackboard Quiz:
From the Control Panel in a Course: Course Tools | Tests, Surveys and Pools | Tests
and from the context menu next to the corresponding Test choose Item Analysis
Running the analysis can take some time (especially if the class contains a lot of students) but the results are worthwhile.
In the screenshots below I am displaying Item Analysis results for the same questions used in the SCORM package, but deployed as a Blackboard Test.
The Item Analysis starts with a summary of the test: how many people have taken it, and an indication of how individual questions score in terms of difficulty (the percentage of students who answered correctly) and discrimination (the ability to distinguish, from the responses to this question, between students who know the subject matter and those who don’t).
Below the table is a list showing the results for each question in detail. You can filter this, e.g. to show just the questions with a poor discrimination value:
After applying several filters to get a feel for the data, you can drill into each question to see the statistics behind these measures:
In this table we see columns of figures. The first shows the breakdown of responses across all learners. E.g. 3.87% of all the people who worked through this material thought (wrongly) that Vanilla translates as “Nectar of the gods”.
The remaining columns show the responses selected by four groups of learners, grouped by their final score. So we can see that none of the people whose final score placed them in the top 25% (Q1) chose the option “Nectar of the Gods”, whilst 15 of the people in the third quartile (Q3, 50–75%) selected it, and 6 of those in the lowest 25% (Q4) chose it. This information is very useful when you come to review the questions.
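For anyone wanting similar measures from their own data (e.g. exported interaction logs), the two headline statistics can be sketched as follows. The upper/lower-quartile method shown is one common definition of discrimination; it may not be exactly what Blackboard computes:

```javascript
// Item analysis sketch: difficulty = proportion answering correctly;
// discrimination = (correct in top quartile - correct in bottom quartile) / quartile size.
// responses: [{ correct: bool, totalScore: number }, ...] for one question.
function itemStats(responses) {
  const n = responses.length;
  const difficulty = responses.filter(r => r.correct).length / n;
  // Rank learners by overall test score, highest first.
  const sorted = [...responses].sort((a, b) => b.totalScore - a.totalScore);
  const q = Math.floor(n / 4);
  const upper = sorted.slice(0, q).filter(r => r.correct).length;
  const lower = sorted.slice(-q).filter(r => r.correct).length;
  const discrimination = (upper - lower) / q;
  return { difficulty, discrimination };
}
```

A question that only strong students get right will score close to +1 on discrimination; one that weak students get right as often as strong students will score near 0.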
It would be great if this information was available from SCORM content too, but at least in the Blackboard implementation, it is not.
I think SCORM can be used to deploy custom content, to convey information, introduce new ideas and help students assimilate this information into their own mental model. If you are only interested in the final numerical result, then it makes sense to include a test within the SCORM content. SCORM 2004 provides better reporting than SCORM 1.2.
If, however, you want to be able to refine the questions and possibly tune the training materials based on repeated wrong answers, then I think at present it is better to decouple the quiz from the SCORM object and deploy it as a separate standalone test (possibly only visible once the learner has completed the SCORM-packaged content at least once). This also gives you more flexibility should you want students to retake the test later and possibly revisit the materials if they fail the test the second time.
I also wanted to test the way the content was deployed using a range of configurations. The Captivate project used a responsive template, so the content uses one of three defined layouts, with break points defined at display widths of 1024 pixels (desktop), 768 pixels (tablet) and 360 pixels (mobile):
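Conceptually, the player picks a layout from the viewport width, something like the following sketch (the exact comparisons Captivate performs aren’t documented here, so treat the boundary behaviour as an assumption):

```javascript
// Map a viewport width (px) to one of the three project layouts,
// using the break points defined in this project: 1024, 768 and 360.
function layoutFor(width) {
  if (width >= 1024) return "desktop";
  if (width >= 768) return "tablet";
  return "mobile"; // below 768px, down to (and past) the 360px break point
}
```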
To test this I launched the course using a laptop and a tablet. I considered using a mobile phone too, but I think this form factor is really too small for much e-learning content. For the laptop I wanted to see the effect of resizing the window. For the mobile devices, I wanted to see if it reacted to a change in orientation.
The link appears as a normal content item:
Clicking the link causes a new window to open. The SCORM content plays in this, filling the screen. The content of the window underneath is repainted to show that the SCORM player has launched in a new window.
When you have worked through the materials and close the SCORM player (and so its window), there is a short delay, then the remaining window is updated, returning you to the content items again.
I then tried resizing the popup window to see how the content layout responded.
When I started narrowing the window width, the content did resize to a layout more like the Tablet option set in Captivate:
When I continued to reduce the width, it eventually flipped to the Mobile layout:
So, albeit with a few layout issues, the SCORM content acts in a responsive manner when viewed on a desktop/laptop.
This test was carried out using an iPad, accessing Blackboard using the built in Safari browser:
Things didn’t quite go as expected when I clicked the link to open a SCORM item. Instead of a new window I got this:
If I then clicked the Launch Course button, the player was launched.
Unlike the desktop browser though, the opening screen is intercepted with this standard Captivate screen designed to provide help for mobile users:
If you click on the Gesture icon, these instructions are displayed:
Click the screen again to dismiss this information, then click on the player icon to finally launch the content:
If you rotate the tablet, the content reflows to (at least partially) support the portrait orientation:
When you have finished working through the content and close the SCORM object, you are returned to the course as before.
The popup blocker annoyed me, so I went to the Settings for the iPad and searched for the options for Safari. Sure enough there was a Block Pop-ups option:
With this duly set to OFF, I tried launching the content again. I expected the file to open, but no, there was a further surprise. Now I was presented with a dialog box asking me to confirm the popup.
After clicking Allow the content launched as before, showing the same special Captivate mobile screen:
When the content is displayed, it is laid out respecting the orientation of the device:
and if turned:
Thus, with a bit of persistence, it is possible to play SCORM content using a browser on a tablet (or phone if you have a very high resolution screen and good eyesight).
The final tests used Blackboard’s Mobile Learn app.
The content is accessed by navigating a series of nested menus:
When you click on one of the links to the SCORM packages, the app does not know how to handle the content, so you are palmed off to the built-in browser:
but eventually you get this
If I click the Launch Content button, the result is again not what I expected:
If you repeat this cycle and click fast enough, you can end up at this page:
I also tried updating the launch settings in Blackboard to do everything I could to avoid the popup:
Result: still no luck with the Mobile App.
As such I have to flag SCORM content as incompatible with Blackboard’s Mobile Learn app at present.
When my son came home from school with a list of one hundred words to learn (thanks Michael Gove) I wondered if some technology could help. Somewhat hesitantly I started searching Apple’s App Store hoping to find something that wasn’t tied to a US-English dictionary. My search turned up a range of apps; the one I settled on was Super Speller by a husband and wife software team – Quiet Spark. One of the reasons for this was their sensible approach to privacy and an absence of ads (well worth paying £1.99/US $1.99 for).
Don’t let the clean interface of this app fool you into thinking it is too basic. It is deceptively powerful. Essentially you create a series of tests by typing in words, then use the iPad or iPhone’s microphone to record yourself saying them. That means your children will hear the words spoken in the local accent. So far so good…
The list supplied by (to?) the school doesn’t just contain words that are tricky to spell (like achieve or rhythm) it also contains words that sound alike. The question that initially troubled me was how can you use an auditory cue to help the listener differentiate between the potential responses? That’s where (with a bit of lateral thinking) this app excels. Rather than just saying the word and stopping, you can follow it with an explanation, e.g. recording the phrase “aloud – as in speaking out loud” or “allowed – as in permission to do something”. This way the meaning of the word as well as its spelling can be reinforced each time the test is taken. Equally, you could include it in a sentence and say something like “Spell allowed, as in ‘you are not allowed to pick your nose'”.
Once a test has been set up, there are a range of delivery options. Most are what you expect – the ability to shuffle the order, ignore capitalisation and, if you really feel the need, to set a time limit. Something that isn’t part of the enterprise testing solutions I am used to (think QuestionMark, Blackboard, Moodle, etc.) but perhaps should be is Super Speller’s Smiley hints option. Essentially this feature adds a Smiley at the top of the screen that provides the user with regular clues as to whether their spelling of the word is on track. This is particularly useful when learning a new list of words. Whilst helpful, achieving full marks in a test using this feature means you miss out on the reward offered under “full test conditions” – a screen full of balloons to pop.
The app also offers a Study the Test mode, where a link is added exposing the iPad’s dictionary. Just remember to set up the appropriate language for your iPad and enable/disable the dictionaries beforehand! The app will honour these settings – an important feature, as it should be your teacher, not the device, that has the last word in how a word is spelled.
Unlike some apps designed for mobile devices, this one supports multiple students, making it great for families who have chosen not to issue everyone with their own device. Before you take a test, you are prompted to enter your name, and the results are saved against your name.
Often it can help to add a few words not on the test list. This doesn’t need to be an attempt to trip them up – inserting the name of a favourite toy or TV character can add a bit of light relief and remind them that learning should be fun!
The app has a lock option you can use to prevent access to the Manage Test (a.k.a. the See the Answers) page. Whilst locking it down might initially seem appealing to parents, if you leave the app unlocked, then children can have fun making up their own tests, challenging each other (and who knows, even their parents!). Creating extra tests has proved much more of a draw to my children than the built-in word search and scrambling tools (though your results may vary!). It also provides some insight into the breadth of their current vocabulary and a chance to pick up any misunderstandings or mispronunciation early on.
It provides good reporting tools if you want to check on your children’s progress – you can step through the responses in each attempt. I’ve yet to explore the tools for sharing tests with others via email, but I can see the advantage, particularly if I was a teacher wanting to use this for practice in my class.
This app was written by parents to help their own child and I think this focus on making it appealing to children is the key to its success. Only time will tell whether the balloon popping will retain its appeal with my children, but Super Speller has already proven to be a good way of getting them to complete their literacy homework. If I could change one thing, I’d like to add the ability to record an introductory or congratulatory video clip for a test, to make it feel even more personal.
She began with an apology – about the US-centric nature of her writing. She then talked about ed-tech as a route for a new US cultural imperialism. (She didn’t use these words, but I think this process could offer an alternative, darker (re)definition of Euan Semple’s catchily named trojan mouse concept). She then had a kick at Blackboard, and another, and another, and also at all the ed-tech startups/wannabes who think “Blackboard sucks” but then essentially want to create another one, just skinned differently. To over-mix my metaphors, a Blackboard in Facebook/Coursera/any old sheep’s clothing still sucks, even if the potential market and spending record of schools and universities on such systems has investors drooling at the mouth. She was exceptionally critical of the lock-in of data and ideas that a VLE facilitates, fenced off from the outside world (nicely illustrated with slides of cows looking at you across barbed wire).
Yet this walled garden did not suddenly come about when institutions signed up in droves to buy VLEs. There have been fences around schools since at least Victorian times. The reasons for these remain complex – is it protection of identity, income, reputation? Is it thought to promote the rarefied atmosphere ‘required’ for learning – i.e. to keep others out? This self-imposed fencing was explored further by questions from the audience – the internet is still being portrayed as something to be protected from – witness the four-page acceptable internet-use agreement my 9-year-old son and I had to sign at the start of his year 5* class at school earlier this month. It could be argued that we get the solutions we pay for, and these learning-management-focussed systems dovetailed neatly with the needs of institutional managers. Her point is, though, that the learners and teachers had little say in this.
She talked about the danger of storing things in the cloud – witness the recent iCloud password hack ‘exposing’ [pun intended] celebrity photos – and stressed the importance of owning your own data. She then talked of a different approach taken by the University of Mary Washington – their Domain of One’s Own initiative – where students and staff are bought their own domain (whose name they can negotiate) and helped to set up LAMP tools such as a blog. An interesting idea and a very brave marketing strategy – note the equal number of dislikes and likes on their introductory video. I’m guessing from the abandoned Bagman blog that this approach (be it marketing or DIY infrastructure) wasn’t to everyone’s taste. This hands-on, take-control-of-your-data approach is one that resonates with Audrey.
It has a sense of coming full circle. It was reminiscent of the early web publishing activities of staff and students in the time before VLEs – Audrey uses her own graduate teaching at the University of Oregon in the late nineties as a case study. This was interesting and resonated with my own early teaching experience. I was also a member of what we could term ‘Generation tilde’ – those who had public web space on their University’s servers, accessible by simply appending ~ and your username to the institution’s domain. We were certainly much freer to publish content than we are now, but I think we suffered from the lack of data about what people were doing on our pages, and few had the skills to code online tests or discussion boards, let alone provide tools where students could begin to construct and challenge their understanding online together. The web was freer, but it was also a lonelier place then.
Reflecting on her talk as I travelled home, I couldn’t help feeling that she is on to something: that somehow we need to improve the base level of digital literacy in the population and heighten awareness of where our data is held and how valuable it is. I loved her quote “data is the new oil”. I also loved another version that Doug Belshaw had heard (apologies, I remembered the quote but not the source) – “data is the new soil” – I think that neatly captures the fact that the data are the beginning, not the end-point.

I am still wrestling with the inherent tension between the desire to be open and the need for private spaces to learn. I think it might be easier for professionals (e.g. teachers) to share materials and, if possible, the journey (including any wrong turns), all subject to continual refinement and reflection. Martin Weller and Gráinne Conole are good examples of this from HE. Yet I think we also owe it to our students to provide them with a ‘safe place to fail’ – somewhere to experiment, try different approaches and angles, without worrying that these actions will haunt them online through the rest of their life.

If we ask/require students to make their learning public, can we predict the effects? I am worried that such an approach may have a negative effect on learners with low self-esteem, the slower thinkers, those still struggling with the subject, or those trying to consider things from an alternative perspective. Would it promote an attitude of playing safe, favouring the students who are first with the most obvious answer, reinforcing the actions of the loudest, or playing to the audience? Yet how many people would really read students’ assignments? Shouldn’t I also draw hope from the fact that people seem to find the courage to post the most remarkable things on Facebook (or perhaps that is exactly what I should be worrying about)? The web does allow you to go back and update your content [if you own the data], so am I just overly paranoid?
Audrey pointed out that at least some of these issues can be avoided through the use of better assessments and pseudo-anonymity – e.g. choosing the name of your blog and domain with care. Doug Belshaw provided a great counter-example of a UK student who was working on a history project blog about native American Indians. His initial postings were not too great, but his enthusiasm for the topic was fired up when, out of the blue, a comment was posted from the son of an Amerindian chief (I hope I got that right Doug!). That’s the way I want things to work. Perhaps it’s time to rethink that fence…
* That’s the equivalent of primary six for anyone in a sensibly numbered education system – is it any wonder many children find maths confusing if we can’t even apply the most basic principles of arithmetic to the year numbering?