Thursday, December 06, 2012

What Problem Are You Trying To Solve?


A few weeks back, I had a productive discussion with Aaron Silvers.  I needed an outside eye to provide feedback on my initial attempts at a learning architecture.

While we were talking, I asked Aaron about developing a community of practice.

"You need to ask yourself two questions:
  • What problem are you trying to solve?
  • How does working together solve it?
This is the binder that will hold the community together."

---------------
Later that day, I wandered into the Data Whisperer's cube to talk about various projects.

After sharing the information from my conversation with Aaron, including his advice about developing communities of practice, the Data Whisperer performed one of his intellectual jujitsu moves:

"So Wendy, what problem are you trying to solve?"

-----------
Thing is - I have so MANY problems I need to solve in my learning environment.
  • Terrible search
  • Inflexible reporting
  • Difficult content import
  • Inability to assign stuff
  • No push
  • Limited mobility
  • Ineffective instructional design
  • Simplistic identity verification
  • etc.....

The image below is a pretty good descriptor.

This picture is the 7 of swords from the Rohrig Tarot.

The monster looks really big.  Within it is a bunch of small monsters.

Right now - I see all of the niggly small monsters.
At least with a big monster, it is easier to focus the weapon.

Am I looking at a "kill 10 rats" scenario?  Or is it a boss battle with one BIG rat?

All I know is that my environment - from base assumptions to tools - is massively broken and I need to somehow fix it - and fast....

Wednesday, December 05, 2012

Another Guest Posting?!?!?!?

2 guest posts in one year!  That's more than I've done in the previous 6+ years of blogging.

How Wendy prepares for a conference - as published in the ASTD TechKnowledge blog.

Thanks Justin Brusino at ASTD for talking me into this.

Tuesday, December 04, 2012

Defining Success

If you want your organization to perform you're ultimately looking at maximizing the supporting systems that contribute to the network (your organization). This includes people systems, technology systems, policy systems, management systems, etc. To know if any of these systems are failing you need data.
- Reuben Tozman

Please go read Reuben's entire post.
---------------------
I've been grappling recently with "success".

What does "success" look like?
  • Personally
  • Professionally
  • On projects I am on
  • Within the systems I touch
  • Within my organization - departmental, divisional, university-wide

As Reuben points out - if you want to perform, you are looking at maximizing your systems.
To know if the systems are failing, you need data.  I don't think that Reuben is strictly talking about quantitative data.

If success isn't defined, how do you know whether your systems are failing?

How do you know whether you are marching in the right direction?

Spending time on the right activities?
-----------------
The problem I am facing is that large chunks of how I've been defining "success" in various realms have either been incomplete or felt inauthentic.

Before I determine how to best maximize my systems, it will help for me to get real clear about what direction I want to march in.

Clarity, sadly, has been escaping me at the moment.

Probably because I am still in the throes of the thrash.

Likely, because I am in the very uncomfortable position of needing to allow others to help me define "success". At least in my professional environment.  I might have allowed others too MUCH feedback in my personal environment.

Then there is that seemingly unbridgeable gap between me and that fog that I want to get to.
Not entirely certain what is IN the fog, but I know I need to get there.

Thursday, November 29, 2012

An Attempt to Structure the Problem

HT to Dave Lee for finding this.

 --------------------------------
I was trying to figure out what big question my little Learning Architecture / Reporting project is attempting to answer.  I came up with the following:

Read more....

Tuesday, November 13, 2012

Job vs Career vs Vocation

I would have liked the face in the middle to look more confused - that would more accurately represent my current state.  Too lazy to make the edits.

From Lifehack.org

-----------------------------------
Job - Merriam-Webster definition 3
a) (1) something that has to be done : task (2) an undertaking requiring unusual exertion
b) a specific duty, role, or function
c) a regular remunerative position

Career - Merriam-Webster definition 3
a field for or pursuit of consecutive progressive achievement, especially in public, professional, or business life

Vocation - Merriam-Webster definition 1a
a summons or strong inclination to a particular state or course of action

------------------------------------
I was asked by a friend of mine recently whether I saw what I did as a job or a career.

At the time, I stated that it was more of a career.  But when I said it, it didn't feel quite right.

The issue I am running into is that I find myself wanting to build my toolkit to get something specific done vs. looking for opportunities to "expand my career."

What I want to do doesn't fit into the model of "consecutive progressive achievement".  

I'm having a tough time getting excited about the march from beginner to specialist to expert to manager to director to whatever. 

And, honestly, I haven't been trained / educated / indoctrinated in anything I currently find myself compelled to do.

I dig creation.

I am finding this compulsive need to create something. Something really big. Something that could even be (dare I say) game-changing for our organization and staff education if we even get this half-way right.

This compulsion seems to be overriding other opportunities.
Anything that crosses my path is measured against whether it will help me make my vision a reality.

It's been a very long time since I have been this obsessive about something.

I guess I should answer that I see what I am currently doing as a vocation.

Can you relate?

Thursday, November 08, 2012

Playing with the Surface

This may be the first time EVER that I have been one of the first folks with a new hot toy.

As I mentioned in the last post - I have to replace my personal laptop.

I tried to use my Kindle Fire as that replacement.  I love it, but I am finding it is a little TOO minimal.

I tried to use the iPad. Despite the improvements to the Apple app ecosystem and to the hardware, I still find it to be a better consumption tool than production tool. And sticking peripherals on the thing seems to defeat the purpose of the product.

So I went out on Saturday and bought myself a Surface.

Thoughts so far:

Digging the touch keyboard. Took a little while to realize that it truly behaves just like a standard laptop / PC keyboard vs a touchscreen keyboard. I got a red touch keyboard - so for whatever reason, my fingers are expecting the Shift key to work like a touchscreen (touch the shift THEN touch the key).  I found that disconnect interesting. 

The big thing with the keyboard is figuring out the appropriate amount of pressure. My weird (yet surprisingly fast) pound-on-the-keys style seems to work well with this keyboard. If I were more of a traditional touch typist, I could see how this keyboard could be problematic. The finger positioning is a bit too subtle and it might require more concentration to get the appropriate pressure on the keys.
-------
After the initial boot - this thing boots up fast. Switches between and loads apps fast. Connects to the internet fast. Considering how long my big fancy Win7 machine takes to boot these days - I'm a real fan of the speed. And I only got the 32 GB version.

IE 10 loads pages quickly. Remembering that the url bar is at the bottom rather than the top will take some getting used to. So will finding some of the features - such as bookmarks (or pins).

Setup was also really easy with my Hotmail account.
-----
Once I figured out the side swipes and the context-sensitive settings, setting this thing up was pretty easy. This might be the first time I have been able to get all of my email addresses in one place.  Same with social media.

Except for my blogs. That's still web accessed. I need to think about a better workflow for that.

Oh, and IM/Chat. I am hoping some app that allows me to incorporate my IM/Chat functions (all one of them - my work IM, powered by Google) is on the roadmap really really soon.
------
The Office 2013 suite opens in the Windows Desktop (or what I'm gonna call the Old Skool Windows app). As a result, it seems like there are a couple of layers to access Word, PowerPoint, etc. The desktop, then the application.  I have a feeling that this may prove to be aggravating - especially if I try to do a copy/paste from Word into a blog.
------
OneNote seems to be the central Office app that will help make this infrastructure truly run. It is essentially Microsoft's version of Evernote.  I've never been much of an Evernote user - but I am going to have to get better at using note-taking applications if I have even a fighting chance of fully leveraging the promise of anytime, anywhere, any device accessibility and productivity.
------
One pattern the mobility guru and I are seeing with Microsoft, Apple and Google - these new environments STRONGLY encourage the use of the cloud. THEIR cloud. Unless as an enterprise we can figure out a way to make it easy (preferably easier) for a user to plant their docs and files into the secure enterprise environment, this could be very problematic.
-----
Thus far, I like what I've seen.

More to be revealed.....

Tuesday, November 06, 2012

Changing the Personal Technology Infrastructure

If you have been following me for any length of time, you know that I am not much of an early adopter. This time is a bit different.

Part of it is timing. My MacBook Pro has essentially melted from the bottom up. Battery - toast. DVD player - doesn't play AND requires me to dig in the slot to yank out any disk I put in there. Keyboard - unreliable. Trackpad - trashed. Even the USB mouse doesn't work reliably anymore.  Essentially - it has become one big iPod charger. A big, expensive iPod charger.

I got about 5 years out of it - so I guess I shouldn't complain too much.

I did not need that expansive (or expensive) a replacement for my personal computer this time around.  When I thought about what I actually do with my personal computer these days - I realized it boiled down to surfing the net, doing some writing, and charging the iPods.

Part of it is that I see real potential in the ability to access information and files in a device-independent manner.  I am also scheduled to replace both my personal AND professional cell phones. One of them will be a Windows Phone 8 - to see whether this promise is actually fulfilled.  I do like the possibility of only having to use one UI across platforms.

I also find I am not much of an "app" user. For whatever reason, I still find myself using web versions of things, even on the iPad.  As a result, the lack of app selection in the environment doesn't bother me so much.

Part of it is that our team is going to have to support Windows 8 whether we want to or not.

I work at a University with some pretty wealthy students. This population tends to bring in the "latest and greatest". This bleeds into our faculty and staff population - whether we want them to or not.

Windows 8 is a pretty big change.  Personally and professionally (I am seeing a theme here).

Just another thing that is going to change the way I work.

Tuesday, October 23, 2012

More Cat Herding



Over the past couple of weeks - just when I think I have a plan, someone else throws in another variable.

My initial plan - Get the LMS selection done and implemented.  Leave the Data Whisperer alone until Fall 2013 to let his world shake out.  Then (hopefully) hit the ground running and finally build the learning and development reporting mechanism of my dreams.

Um....yeah......
--------------------------
One of the great things about being a trainer is that I get to touch multiple parts of an organization. 
I may not touch them long - but they at least know who I am.

I was catching up with a fellow IT colleague during a town hall for the greater University community.

"Hey Wendy, what do you know about this other content management enterprise integration project?"

Um....I know it is a pain point for the client and that they need to come up with a better process.  I thought we were going to get something done longer-term.

"Yeah - we just got this in our queue."

Hmmmm......

A later conversation with Sally let me know that another project was afoot.

"I just discovered there was a whole certification table in our enterprise system!  We're going to try and leverage that!"

Didn't know our enterprise system did that.

"Yeah - neither did we.  This should be cool."

Hmmmm.....
------------------
So despite my plan to leave the Data Whisperer alone until Fall, my management chain tasked me to go talk to him.

I wandered over to DataWorld with a list of notes and a number of questions - along with the admission that I am going into the conversation very confused.

"Make that two of us."

-----------------
I've always believed that information should be free and that knowledge is only powerful if it is shared.

He laid out what he knew.
I laid out mine.

1 hour later - we had a better plan and an idea of the role each of us plays in each "project".
------------------
I am suspecting that this may be the first of many "emergencies".

This gives me an important variable when we go to design the new reporting system.

Despite our intense desire to make everyone "conform" to standards - chances are we are going to have to accommodate multiple content libraries and management systems, multiple course management systems, multiple LMSs, and other sources of random input.

This could actually be a really good thing. Potentially giving us a much better sense of what is actually happening in our environment vs. sticking our head in the sand and INSISTING that everyone follow our lead, use our tools, and cooperate.

I work in Higher Education.  I herd cats.  The chances of us being able to get the cats to follow are practically nil.  Instead of fighting it, why not go with it?



Thursday, October 18, 2012

Doin' a MOOC



eLearning folks have been talking about MOOCs (Massive Open Online Courses) for a while.

Like "Mobile" and "Gamification" and "Analytics" - MOOC is becoming another buzzword that is being thrown around our environment.  As in "Can we make this a MOOC?" 

Implied in that question: they want to know whether we have (or can purchase) an application that "makes our course a MOOC" vs. doing the hard work of designing a course as a MOOC and building the support structure behind running one.

---------------------
I hesitate to attempt to speak intelligently about something until I have actually done it.

To get a better idea of what a MOOC is, I found a course where I had decent background knowledge - Power Searching with Google.  This way, I could better focus on how the course is designed and structured.

As I participated in the course, I realized "This is circa 2002 Distance Education with a couple of minor tweaks!"   Just this time, it has a cooler acronym.

I am defining "Distance Education" here as those time-limited multi-day courses that attempt to replicate / improve on the classroom experience without the classroom.  Often seen as semesters, but I've seen it as short as a week.
-----------------------------
2002 - lectures / presentations appear at designated time. Most of the ones I saw were PowerPoints.  If I was lucky (and had really good internet access) - they had audio. Often these are the same length as the original "classroom" lecture (an hour is pretty popular).

2012 - lectures / presentations appear at designated time.  These are chunked much more finely into topics (in the Power Searching with Google case - about 3-9 minutes in length).  Activities in between.  Power Searching with Google used a lot of video.  It reminded me of the 80s-era "learn to use your computer over educational television" shows - same approach, still video.  Today - the video quality is much better.


Digging the couch and the old theater curtains.

Here is an old Computer Chronicles from 1985.  This was sold to Public Television stations as a way to fill daytime.

-------------------------
2002 - community-building through message boards. Week 1 - getting to know you.  Participation matrix is important for evaluation.

2012 - community-building through chat, forums and virtual conferencing (like Google Hangouts - we can see FACES!).  Week 1 - getting to know you.  Participation in any of this is optional - but strongly encouraged and designed into the assignments and activities vs treated separately.  Participation may not be limited to the "official course forum". Jim Groom and Alan Levine's DS106 course at the University of Mary Washington is a great example of how this can work.

The DS106 class is a semester-long MOOC - and does have grading for participation, plus structure and focus for each time period.   Clues to one possible structure can be found in the syllabus for Summer 2012.
----------------
2002 - assignments due at a particular time.  Submitted in a particular way (WebCT was my exposure to this).

2012 - assignments might be time limited or might not. Submissions can occur across multiple media in multiple locations (Twitter, Facebook, personal blog, YouTube....). Generally - it seems to be best to follow the course from week to week so you can maximize your interaction with the community (since everyone participating will be focused on the same thing) and the facilitators.

Dave Cormier, George Siemens and Stephen Downes are finishing up a cycle of Change in Formal Education Systems.
A new cycle appears to have started a few weeks ago.  Click here to see the schedule.

------------------
One of the areas where I am a bit cloudy is administration.  What it is like from the instructor perspective.  How does it work with a class of a few thousand (not that I am going to try that off the bat)?

I managed to find a few clues in this Educause paper.

-----------------------
Sally and I MIGHT be trying to MOOC our Telecommuting onboarding series for the next round.
At least - I stuck the bug in her ear.

We are thinking that the population is reasonably small (about 60 per class), some folks in this next cohort will have difficulty getting away for an hour during designated course times, and it might be a safe opportunity to experiment with the format.

Couldn't hurt.....
--------------------------
Some articles by folks who have a much better idea about what a MOOC is and how it works than I do.
Review of MOOC Developments
Impressive MOOCs You Never Hear About

Tuesday, October 16, 2012

Jumping Too Quickly

The Manager, the Director and I sat around a small table staring at feedback for our (not so) little LMS project.  We are in the process of walking this project through our relatively new project intake process and have asked for feedback from some of the executives - including the SWAT team leader and the Data Whisperer.

"So are you looking at this as an augmentation to an existing system or as a new system?  Requirements collection will be really different depending on the direction you all are headed."

I have already started the process of culling through all of the comments and complaints over the past few years regarding our LMS.  Using the Communications Point Man's example - I am pretty much dumping everything and anything into this initial draft of the requirements.  I have found over the years that having at least something as a discussion point makes it easier for folks to provide feedback.  Rapid Prototyping FTW!

One lesson I learned is that sometimes you have to go with the flow to get stuff done.

I mentioned in a previous post my thought that Compliance Reporting could potentially be an easy "kill."
Compliance reporting - however - is not terribly sexy.  Nor does it address the myriad other issues in our current process and tool set.  There is also a possibility we can address it within the LMS project itself.

During our conversation - I was reminded of the importance of timing.  There are a few things occurring in our environment that make separating and delaying the reporting piece of this project very desirable:
  • A data cleanup of our HRIS system (long overdue)
  • A major re-org in the Data Whisperer's world. 
  • Because of said re-org - the Data team's priorities might change.  Hopefully in our favor, once the dust settles.
  • If we wind up getting a new or different or added LMS - any reporting work we build against the old one would be duplicate effort if we decide to completely discard it.
I've been kicking around the idea of separating the reporting piece from the LMS project altogether.
  • It might allow us more time and focus for determining what reports the stakeholders really need.  It would be cool to find an underlying strategy behind it all as a result of those conversations.  Not holding my breath there.
  • We would be able to focus on what inputs would need to be collected - because there is no WAY one LMS is going to capture everything in our environment. I am also not convinced that whatever LMS we wind up with should serve as a "training portal".  Too tough to access and there are likely better tools for the job.  I came to this conclusion through hard experience :(
  • We could work on something that would scale to different inputs (such as SharePoint, Drupal pages, the Student LMS, external content systems, new enterprise products).
  • We will have a much easier time keeping the sensitive information in-house vs. grappling with our Legal department and the vendor.
  • We would be able to connect to our business analytics without having to run a number of updates on a regular basis and run the risk of outdated material or accidentally erasing stuff.
So this is what we decided to do (at least as I currently understand it):
  • Approach the LMS project as a "new LMS".  This will require more comprehensive requirements gathering - which is needed anyway.  It's been awhile since the stakeholders have had a true "training" discussion and the environment has changed significantly since the last time we chatted.
  • Separate out reporting as a whole 'nother project for after implementation.  That should give time for some of those environmental factors to shake out.
  • Attempt to get a decision made in the March-May timeframe so we can either implement the new or reconfigure the old before Sept 1, 2013.
The plan is coming together.  Next step - see if the senior execs think this is a decent idea.

Thursday, October 11, 2012

Trends and Gaps

In preparation for my conversation with the SWAT team leader - I put together one of those text-heavy PowerPoints, if only to sort out my findings to date and give me some points of conversation.

Here's the PowerPoint.  

I didn't have to inflict the whole thing on him for him to see some angles of attack for a Learning Environment redesign.

1) Map the process this fits into.

I have been approaching the Learning Environment based on MY needs.  Yes, I should know better.  Don't judge.

The SWAT team leader's recommendation (done somewhat obliquely through his demonstration of ASU's fantastic Research Administration help site) was to look at the process the problem fits into.

As I thought about it - the training I do fits into the Employee life-cycle. This needs more concrete definition, but right off the top of my head, this is what I came up with:
  • Onboarding
  • The march from beginner to expert
  • Compliance issues and requirements during employment
  • Career development
  • Termination (and here I am including retirement, voluntary departure, firing)
The issues our training groups face are embedded within this life-cycle.
  • Finding help - both information and people (experts)
  • Reporting (what did people take? When? Did they take all necessary mandatory training?  Nevermind the more advanced stuff that includes business analytics.)
  • How best to determine appropriate curricula for each level of employee.
2) Think about a development roadmap based on that process and the pain points within the process.

This idea has popped up a couple of times over the past month.  I've had one in my head.  I'm not sure if one actually exists for our training program elsewhere.  If not - it is definitely time to put one together.

3) Once the process (as it currently stands) and the roadmap for improvement are defined, choose 2 or 3 things to focus on over the next year.

I kept staring at my strange little PowerPoint and two things jumped out at me:
  • Compliance.  Almost every request I have received from outside the IT department over the past year has been centered around Compliance.  The pain points (gaps) I had identified in our process have at least one compliance element in them.  From terrible instructional design, to awkward approval processes, to manual reporting. 

  • Reporting. From what I can tell (and maybe it's because I am a trainer), training is a critical element of Compliance.  More grant and regulatory agencies are wanting evidence of "training" in their pet topic.  In conversations around the organization, I've been repeatedly told that being able to provide this evidence is key.  (Nevermind actually reducing the number of compliance cases and/or reducing the average size of the reward).
(Of course, this raises the question as to WHY compliance has become the thing we focus on vs. performance improvement / innovation / whether or not our training actually did something useful etc.  Probably best I don't go there....)

I then thought about my recent conversations with the Data Whisperer, the SWAT Team Leader, Syd and Sally...

What if we could use Compliance Reporting as the base?  That is an immediate need.  It would serve as a driver to help us get our baseline L&D reporting in line before we start adding non-traditional inputs (such as business data). And it could help each member of the project reach their strategic objectives - which right now seem to center around analytics.

Makes sense right now as I type this.  I may also be jumping to conclusions too quickly.....

--------------------------
David Jones has a couple of recent posts that provide food for thought on this.

Compliance Cultures and Transforming the Quality of eLearning
The Core Problem with Learning Analytics

Tuesday, October 09, 2012

Desperately Peeking Over the Silo

One of the bonuses of the Thrash is that I am more open to new sources of input.
It also reminds me that my ADD generalist leanings occasionally come in handy.

Read more....

Thursday, October 04, 2012

In the Throes of the Thrash


Thrash metal and puppets.
Doesn't make the process any easier - but it does make it a heck of a lot more fun to listen to.
 -----------------------
I've mentioned before that I am looking to completely redesign the learning environment at my organization.

Read more...

Wednesday, October 03, 2012

In the Arena


In my mind, one of the greatest songs REM ever recorded.
---------------------
“It is not the critic who counts; not the man who points out how the strong man stumbles, or where the doer of deeds could have done them better. The credit belongs to the man who is actually in the arena, whose face is marred by dust and sweat and blood, who strives valiantly; who errs and comes short again and again; because there is no effort without error and shortcoming; but who does actually strive to do the deed; who knows the great enthusiasm, the great devotion, who spends himself in a worthy cause, who at the best knows in the end the triumph of high achievement and who at the worst, if he fails, at least he fails while daring greatly. So that his place shall never be with those cold and timid souls who know neither victory nor defeat.” - Theodore Roosevelt

---------------------------
I've run into the above quote a lot recently.

Along with the notions that
  • Success is built on failure
  • Hope is a function of struggle
  • Sometimes the bravest thing to do is to just show up.
I don't know if anyone else is sensing this - but we are in the midst of some major changes in the corporate learning space and in the way we need to think about things.

I'm seeing bits and pieces in the conversations surrounding mobility, performance support, analytics, informal learning etc.

The big flashing message I'm seeing locally is "Your system is broken!!!!"

The realization that everything I have been doing since getting my Instructional Technology degree in 2003 doesn't quite work anymore is more than a little unnerving.

Losing one's "religion" is a tricky thing...


Tuesday, October 02, 2012

Building a Mobile Strategy Part 2: The Troika

Building a Mobile Strategy Part 1
Building a Mobile Strategy Part 1a

The Mobility Guru and I have been holding monthly meetings - not counting casual encounters where we run into each other on campus.

We've learned a few things over the past couple of months:
  • Once our web team finishes implementing our new content management system (they are really close to closing the project), all hands on that team turn to mobile.
  • There is a "Mobility Steering Committee" among some senior execs around campus.  This, however, seems to be focused on an app.  Good for the Mobility Guru and me because that means we are not missing anything that we might possibly need to know right now.
  • There are some other scattered "Mobility" efforts - most notably among the Security folks.  This provided an opportunity for awareness.  Better late than never.
We finally managed to have a full Troika meeting recently.  The Mobility Architect got out from underneath his configuration management project to talk.

Some good stuff came out of this meeting:

1) Both the Mobility Architect AND the Mobility Guru realized fairly quickly that our current policies don't adequately cover the new environment.  We think there is an effort underway to take a new look at all of our IT policies - particularly for data management.

2) A trend both of them noticed is that mobile OS providers (Apple, Google, Microsoft) are beginning to strongly encourage the use of their own cloud storage solutions.  This could be quite problematic for an enterprise since the temptation to put any-ol-document onto the "personal" cloud share for the sake of convenience is quite high. 

Conclusion: The definitions of what is appropriate on the cloud shares (and mobile devices in general), and what abso-positively-lutely does NOT belong on these cloud shares but behind secure document management instead, need to be very clear.  My argument - the less grey area, the better.

My 2nd conclusion: Any solution created for this needs to be really easy to use.  My thinking - ideally it would be as easy (or even easier) to put mobile stuff in the secure area vs the "mobile-phone-OS-approved" cloud share.

The easier it is - the less training I need to do on it - the more likely the solution will be adopted.  Win for everyone!

3) My role in all this - voice of the end-user and lcd (lowest common denominator) tester for any processes that come out of this.  I am perfectly suited for this because a) I'm an application trainer notorious for breaking things and b) I suck at using my smartphone.

I got involved because I wanted to make sure my mobile strategy followed what the rest of IT was doing. 

Still doing that - but this other role promises to be a lot of fun.

Thursday, September 27, 2012

End-to-End

I was sitting in a conference room last week with the IT Den Mother (also known as the Executive Assistant to the mucky muck) waiting for the mobile video teleconferencing unit to be wheeled in for a meeting.

"I was trying to cancel one night of a two-night reservation for a contractor we have coming in.  I couldn't do it - so I called up the reservation desk of the hotel.  The reservationist was using the same system and she couldn't do it either.  Something that would have taken 5 minutes with a well-placed phone call before now took 30 - and still had to be done by hand.  Why do software designers design in pieces?  Why can't they just make it easy for me to go through a process?"


I ask myself this with almost every application I train. 
It's actually one of the most important pieces of my job - translate the individual pieces of the application (many of which were designed and coded by teams that don't talk to each other) into something resembling a workflow.

It's the piecemeal cobbling together of different bits of code from different teams (and possibly different companies as organizations merge or get eaten) that create the little quirks that make a number of applications so hard to use.

  • Search in this field using this technique
  • Search in that field using this other technique
  • Use tabs and drop-down menus to navigate in this section
  • Use left hand menus and separate pages to navigate in another section
  • Use the standard model for navigating the application - unless you are trying to do this special, but super-important, thing that requires an entirely different workflow.
This next year - I am on a couple of projects that promise to change the way the University does things.
Big strategic and cultural change projects.

I'm going to have to be the IT Den Mother's voice.

How can we make it easy for the folks we work with to do what they need to do?

Tuesday, September 25, 2012

TDRp: The Reports

In the TDRp - once you have the statements, you can then generate reports.
Essentially - re-packaged statements.

The white paper currently recommends three reports:
  • The Summary Report - goes to high-level executives and shows progress against high-level goals
  • The Program Report - individual reports for each program for the learning execs
  • The Operations Report - an overall view of course usage and L&D costs - pretty similar to the high-level efficiency statement, but with some project management numbers for items under development.
Looking at where our current gaps are - the Operations Report might be the easiest of the 3 to implement.
We have a project management system we can get those numbers from.  And the LMS takes care of course utilization numbers.  The biggest block to implementing this report is the cost numbers and where those might be housed.  I am hoping that my discomfort with whether we can get this information in an automated way is more a result of my ignorance of our budgeting and purchasing system vs. reality.

The Summary Report and the Program Report require defined programs.
Something I talked about in an earlier post.
Without those defined programs, both of these reports will be impossible to generate.
Once we DO have those programs - the money numbers and where those are housed come into play.

Thankfully - getting the cost numbers in a place where they can be easily extracted will take care of the gap in all 3 reports.

I suspect the strategy / program discussion will be a more difficult issue.

--------------------
So the heartening thing about this analysis is that we are not as far away from being able to implement this model as I feared.  We're working with more than I originally thought.

There is still a LOT of work to do in the meantime.

The exercise helped me.  I hope it helped you too.....


Thursday, September 20, 2012

TDRp: High Level Business Outcome Statement

[Image: sample High Level Business Outcome statement from the TDRp white paper]

This one will be the toughest nut to crack.

Mostly because we don't have a central decision-maker leading our training initiatives. 
And the greater strategic plan seems a bit vague from my view at the bottom.

No real "programs"
No solidly defined objectives
No quantitative goals that actually align to those objectives
Activities that (in the best case) only roughly move towards the ill-defined goal and objective.

I say this because I know for a fact that our organization is not alone.

To my current employer's credit - they are working hard to move in that direction.
Baby steps.... but any progress is good progress.

Until there is some sort of focused initiative or defined set of business outcomes that the entire institution is expected to work towards, we will be hard-pressed to even begin coming up with this statement.  Much less the appropriate measurements.

(I am expecting to hear an "I told you so" from the Data Whisperer any day now....)

Tuesday, September 18, 2012

TDRp: High Level Effectiveness Statement

[Image: sample High Level Effectiveness statement from the TDRp white paper]

The big thing stopping us from quickly implementing this statement is surveys.

Some groups are better at implementing surveys than others.  And we don't have a standard survey tool.

Of course, our training "organization" is organized like Al Qaida without the single-minded focus (a whole 'nother issue).

We would also need to decide what programs are worth the time spent surveying. 

We can't just do blanket surveying because our organization is very averse to distributing surveys without approval from various groups that seem to appear out of the woodwork whenever one threatens to try to get information out of people - such as "did you like the course".

There is also that "truthiness" factor.

How many of you just circled 4s or 5s on a survey and over-estimated how "important" the class was to your job just to a) get out of there and b) justify taking more classes later?

Or am I alone in this one......

-----------------------
Level 5 numbers may need to be collected using a combination of duct tape, string and chewing gum data resources.

Again - I need to find out where these numbers are stored and which systems we would need to pull the data from.  I fear that there may not be a real "system" at all.

I'll probably spend more time calculating numbers once I find them.
----------------------

ROI numbers can be calculated using the formulas I noted in the post
Calculating Return on Investment and Benefit / Cost Ratio
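
(For anyone who doesn't want to click through - those formulas boil down to a couple of lines.  A quick sketch, with made-up dollar figures:)

```python
# Benefit/Cost Ratio and ROI - the standard formulas.
# Dollar figures are made up for illustration.
program_benefits = 150_000.0  # monetized program benefits
program_costs = 100_000.0     # fully loaded program costs

bcr = program_benefits / program_costs
roi_pct = (program_benefits - program_costs) / program_costs * 100

print(f"BCR: {bcr:.2f} - ${bcr:.2f} returned per $1 spent")
print(f"ROI: {roi_pct:.0f}% - net benefit as a percentage of cost")
```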

Other useful learning metric posts:
- Monetizing Program Effect
- Another Way of Looking at Learning Metrics

Thursday, September 13, 2012

TDRp: High Level Efficiency Statement

[Image: sample High Level Efficiency statement from the TDRp white paper]

This should be the easiest statement for us to implement.  Many of these numbers are already available in our LMS.  The big challenge will be capturing learning object types since we need to better differentiate between classroom instructor-led training, virtual instructor-led training and blended training.

The bigger version of this document (found in Appendix II of the full white paper) also includes Utilization, Reach and Development metrics.  Utilization (how popular the courses are) and Reach (the percentage of employees who used the resources) can also be at least estimated from our LMS.

The program management numbers can be estimated from our project management system - though these numbers are a bit rougher due to some project intake workflow issues my division is currently working through.
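
To make the "capturing learning object types" piece concrete - here is roughly the rollup I am picturing.  A sketch assuming a hypothetical CSV export from the LMS; the column names are mine, not the vendor's:

```python
# Rough sketch: rolling a hypothetical LMS export up into a few
# efficiency-style numbers. The column names are invented for
# illustration - they are not our LMS vendor's actual schema.
import csv
from collections import Counter

completions_by_type = Counter()
unique_learners = set()

with open("lms_export.csv", newline="") as f:
    for row in csv.DictReader(f):
        unique_learners.add(row["employee_id"])
        # delivery_type: "classroom ILT", "virtual ILT", "blended", ...
        if row["status"] == "completed":
            completions_by_type[row["delivery_type"]] += 1

print("Completions by delivery type:", dict(completions_by_type))
print("Reach (unique learners):", len(unique_learners))
```

If the export had a reliable delivery-type column, the classroom / virtual / blended split I am chasing would fall out for free.  That's the gap.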
 ---------------------------
The TDRp also accounts for other types of learning interventions such as social media, performance support tools, and other types of "non-course" interventions.  The defined learning types in this example just looked like an easy kill since we already have these in our LMS.  Furthermore, we don't currently have tracking of the other types of interventions - at least, not without begging our web people to help us out. And that, honestly, would be a pretty large project.

I have suckered convinced one of my current project teams to allow me to use them as lab rats to experiment with ways of differentiating the virtual sessions from the classroom sessions.  I have gotten some excellent feedback from them and hope this will help me get some reports separating classroom / virtual / blended.

------------------------------
The second, more pressing, challenge is with costs.

I would like to think that our costs are tracked in some sophisticated system that exports in an easy-to-read, easy-to-manipulate format.

I strongly suspect that our costs are found in various pieces of paper, napkins, spreadsheets and electronic detritus.

I'm not the money person in our department, so I am not privy to how they track these numbers.

--------------------
The detailed version of the Efficiency statement separates this high-level information into various important programs.

I think once our organization actually defines what these important programs are, and what is in these programs, we might be able to create one of these statements.

That's not a data / computer issue. 

That is a people / decision-making issue.

Not sure what to do about that one.

Monday, September 10, 2012

Teaching a Thought Process


On a bit of an LCD Soundsystem kick.  Not sure this has anything to do with this post.  
My blog...I can do what I want.

------------------------
During one of my earlier zombie project reanimation efforts, the Data Whisperer asked me
"How can we teach people how to understand databases, how they work, how to design them and how to get information out of them?"

A year or so after that conversation, staring at a project intake document for a project called "LMS Data Integration v3," I thought "Guess we are just about to find out."  

Our content library has spiffy courses like "The Logical and Physical Database Design Methodologies" and "Introduction to Relational Databases".  Just reading the descriptions and objectives for these courses hurt my head.  And I have a lot more exposure to this sort of stuff than the audience we had in mind.  I don't see our audience making it through 30 minutes of these courses without throwing bricks at their monitors.

How do we get non-technical folks who have a vague idea of what they are looking for to think in terms of defining the information they need, the format they need it in, and where to get the pieces from?  If only to better help the technical folks help them.
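
One half-formed idea: instead of starting folks with database theory, start them with a "report spec" - the question they are answering, the fields they need, and where those fields live.  A hypothetical sketch (none of these names map to a real system of ours):

```python
# Hypothetical "report spec" - a way to get a non-technical requester
# to articulate what they need before anyone touches a database.
# None of these names map to a real system.
report_request = {
    "question":  "Who has NOT completed this year's mandatory safety training?",
    "fields":    ["employee_id", "name", "department", "completion_date"],
    "sources":   ["HRIS (active employees)", "LMS (completion records)"],
    "filters":   {"course": "Safety Training 2012", "status": "not completed"},
    "format":    "spreadsheet - one row per employee, grouped by department",
    "frequency": "monthly",
}

for item, detail in report_request.items():
    print(f"{item}: {detail}")
```

If someone can fill that out, the conversation with the technical folks gets a whole lot shorter.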

--------------
As I've mentioned in a previous post - I've been picking through the Talent Development Reporting Principles (TDRp) information.

Because I figured if I can define the rules, I can win the game :)

That and now I have a document I can point execs and co-workers to that is written by folks that are not me.

Familiarity breeds contempt, ya know.

Sometime this weekend, while watching RGIII pick apart the New Orleans Saints defense, it dawned on me...

Backwards design!

-----------------
The TDRp white papers have sample reports.  What if I looked at each of those reports and figured out where we would get the information for each section from?

Maybe if we figured out what our end-state reports would look like, we would have a road map towards developing data systems that would give us what we need.

The analysis would (hopefully) give me an idea of what I can do now, what needs to be changed, and where (or if) I can get the other information.

Over the next few posts - let's see whether this idea works.

Thursday, September 06, 2012

Live Streaming Test Results

Update: UStream recording of our Bocce match

--------------
Our results focused on the following:
- How aggravating was it to set up and administer
- How aggravating was it for the end-user to access and watch

We didn't really worry about the participation piece - but I made comments on that anyway.
Because I know someone will ask.

Blackboard Collaborate 12 - We have been using Elluminate v10 and decided to test the most recent version of Blackboard Collaborate.  Better video streaming and audio quality than prior versions - by a lot.  I wouldn't wish video conferencing in Elluminate v10 on my enemies. The user needs to know how to expand the screen.  Otherwise, they are going to be looking at teeny tiny people in a small window. At least until we can figure out how to auto-expand the window for the viewer in the new meeting interface.  The built-in interactivity features are still available.

Biggest bonus for us - it is currently our main web meeting tool, so we won't have to go through the security process.  The other thing we like about Blackboard Collaborate is that we can more easily limit who can access the live meeting and the resulting recording.  We were also familiar with setup and sending meetings - so we can't really comment on the learning curve for video-meeting administration.

The Blackboard Collaborate 12 UI is very different from Elluminate v10 and requires some getting used to.  This slowed our test setup down a bit. 

UStream - I played with this at Innovations in eLearning 2012.  Not bad with my netbook and cheap webcam.  This time, we had a nicer (larger) laptop with better multimedia cards and (slightly) more stable wireless.  On the broadcasting side, the video was choppy.  However, the viewers reported that everything seemed smooth and audio/video quality was decent.  Chat and social streams are available, but they are awkward to get to in the interface.

This would be best used for things that you want to be fully public and where you don't really care about interaction.  I know our HR team is planning on using this service for some of their Service Excellence keynotes.  There are limits on free UStream (100 viewer hours) and they are more aggressively marketing the paid services, so we need to keep an eye on that.

Google+ Hangouts - We looked at Google+ Hangouts since Google is adding the Google+ features to the enterprise services.  Great for 12-person conversations and meetings.  Not quite appropriate for the application we were considering. Setup was a little awkward through the Events feature in Google+.  Straight Hangouts is interactive via video chat.  Didn't see other types of collaboration.

I will also admit I am not very good at using Google+, so these comments may reflect a PICNIC (problem in chair, not in computer) error vs an application design flaw.

Craig Wiggins was kind enough to join us at the beginning with his Android.  He reported that video and audio quality was pretty good (for a cell phone).  He also made us jealous by joining us from the Udvar-Hazy Air and Space Museum.

A couple of features we didn't test: the "Hangouts on Air" feature - which would have been closer to the use we are considering for this tool - and the collaboration features through Google Drive.  If our organization decides to turn on enterprise Google+, we will investigate these features further.

Our in-house testers said that audio and video quality were good.

Update: I talked to Ben Fielden in person after he posted his comment. By above the line vs. below the line, he means what we have purchased (appears in the administrative interface above a line) vs. the stuff Google wants us to purchase (below that line).  So if you wonder why your enterprise IT is not terribly excited about implementing that cool new feature you get for free, ask Google.

ooVoo - A non-starter.  Requires a download that also adds one of those aggravating toolbars to all of your browser windows.  No one could get into the link, and even if they could, I wasn't in the position to answer the phone.  A pain to set up.  Just....no thank you.

---------------------
I'm going to send the results to the rest of the team for next steps.

Again - this will be an ad hoc, temporary recommendation while we get our official, formal solution for this type of thing set up.

Tuesday, September 04, 2012

Ad Hoc Live Streaming Test

We are starting to see demand for Live Streaming services.
IT has a project underway that will address that formally.  High quality, fancy equipment and all that.

This is not that project.

We decided to take a little time to test out cheap/free solutions to suggest to our clients while we waited for the big "everything communications" project (somewhere around 2014).
-------------------

We tested 5 tools - Elluminate, Google Hangouts, vTok (for the iPad), UStream, and ooVoo

Yes - I know that some of these tools aren't designed to do live streaming, but we had them lying around.

Also - Skype was taken out of consideration because of security issues.  
People use it anyway - but IT is reserving the right to point and laugh when they have a security issue.

We asked our audience for feedback on the following:
- Video latency - Does the video freeze?  Look jerky?
- Audio latency - Does the audio stop?  Do I get "chipmunking"?
- Ease of access
- Does it freeze the computer?
- Any comments regarding features and user experience.

Testing Protocol
We decided it would be fun to use a game of indoor bocce for the test.


This allowed us to test the following:
- Motion
- Visibility from standard "speaker" distance of faces AND (hopefully) PowerPoint presentations
- Audio from standard "speaker" distance


And how often do you get an entire abandoned space to play in that still has power and wireless?
We had to do SOMETHING fun before we all moved out ;)

Testing environment considerations - This space is in the basement of a dorm.  With all the kids. And their movie streaming.  We felt that this provided the best "worst-case scenario" short of a hotel ballroom (which this space used to be).

- Relative video quality - we are using different web camera setups. We plan to run another test focused on web cameras later.

- Relative audio quality - again, we are using different microphone setups.  We plan to run another test focused on audio input devices later.
----------------
We will have the results of these tests in a later post.  And maybe video :)

Wednesday, August 29, 2012

Not Reinventing the Wheel

I've been nursing some half-baked ideas around analytics for some time now.
I hand these to the Data Whisperer annually - mostly hoping he can work miracles with my unformed mess of stuff.

THIS time, I also handed him the overview version of the TDRp white paper.

I figured I'd take advantage of the efforts of folks much better at numbers and maths and ROI and data and stuff.  In this case - Frank Anderson (former head of DAU), Josh Bersin (Bersin and Associates), Jack Phillips (ROI Institute) among others.

I've been keeping track of this project since the SkillSoft / KnowledgeAdvisors sessions back in May 2011.
Part 1, Part 2, Part 3.

They have the basic reporting framework built.  The next couple of years will be spent fine-tuning definitions, models and recommendations.

The 3 TDRp reports (Quarterly Summary report for the senior execs, Monthly Program report and Monthly Operations report for the learning execs) parse information from three statements:

- the Outcome statement - Goals and the impact of training on those goals

- the Learning Efficiency statement - L&D cost + opportunity cost, cost reduction, "butts in seats" and courses used

- and the Learning Effectiveness statement - whether folks (students and managers) liked the class, planned to apply the stuff from the class, and estimates of value and impact
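
The way I am reading the white paper, each statement is essentially a small bundle of named measures.  A loose sketch of how I am picturing them - my own framing for thinking purposes, not TDRp's official schema:

```python
# Loose sketch of the three TDRp statements as small bundles of
# named measures. My own framing - not TDRp's official schema.
from dataclasses import dataclass

@dataclass
class OutcomeStatement:
    goal: str                     # the business goal being supported
    target: float                 # where we said we'd be
    actual: float                 # where we are
    training_contribution: float  # estimated share attributable to L&D

@dataclass
class EfficiencyStatement:
    total_cost: float             # L&D cost plus opportunity cost
    participants: int             # "butts in seats"
    courses_used: int

@dataclass
class EffectivenessStatement:
    satisfaction: float           # did folks like the class (avg survey score)
    intent_to_apply: float        # % planning to apply the material
    estimated_impact: float       # estimate of value and impact
```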

---------------------
I've been staring at the white papers over the past couple of days trying to figure out what we would need to come close to this model in our learning analytics.

I came up with the following:

- Objectives.  On both a university-wide level and on an individual "intervention" level (projects, technologies, process improvement initiatives and, yes, training)

- Measurable goals mapped to those objectives.  This would include a measure of "where we are now".

- A way to track who took which class and how they did (traditionally the realm of the LMS).  

- A survey tool.  I am lumping evaluation tools in here.  Ideally - it would be one tool linked into the LMS so I can put this all together in one spot.  It may require 2 separate tools - one that scores (the evaluation tool) and one that just collects information without judgement.

I'm debating whether I want the ability to detach my survey tools from individual learning objects. The reason why I may not want it attached to a particular learning object?  Many of the interventions I design these days are not "courses" but collections of on-demand resources that may or may not be connected to a "course" as we understand them.  Those resources may also cross programs.  I want information from the aggregate of those interventions as well as from the individual pieces.  Need to think on this more.....

- A way to run pre and post metrics (before and after the intervention).  The system would be dependent upon the objective and the goal.  OR if there was one Data Warehouse to rule them all.

Our organization is not there yet.  Actually - let me know if your organization has a Data Warehouse that successfully integrates ALL of your ERP, Finance, Payroll, HR, etc systems and allows end-user folks to easily spit out reports with no aggravation.  (Hey - and if we really want to get buzzwordy, maybe we can even talk about Tin Can here and having ALL of these systems output in the same language so we might have a fighting chance of putting something like that together :)   )
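
For the curious - the Tin Can idea is that every system emits activity records in the same actor / verb / object shape.  A stripped-down sketch of a single statement (the identifiers below are placeholders, and the real spec has more to it):

```python
# A stripped-down Tin Can (Experience API) statement: every system
# reports activity in the same actor / verb / object shape.
# Identifiers below are placeholders, not real endpoints.
statement = {
    "actor": {
        "name": "Wendy",
        "mbox": "mailto:wendy@example.edu",
    },
    "verb": {
        "id": "http://adlnet.gov/expapi/verbs/completed",
        "display": {"en-US": "completed"},
    },
    "object": {
        "id": "http://example.edu/courses/compliance-101",
        "definition": {"name": {"en-US": "Compliance 101"}},
    },
}
```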

I think the TDRp has a good model for what the reports should look like coming out the other end.
At least - it is a great starting point for further discussion.

I'm hoping that the end-point gives the Data Whisperer and me enough to sink our teeth into...even without objectives and measurable goals.

Saturday, August 25, 2012

I Actually Guest Posted Somewhere!

I probably should have tooted my own horn earlier.

The nice folks at Rustici Software asked me to guest post on their Tin Can Chatter blog.

Tin Can in the Trenches

Reading it a month later, I'm pretty proud of this one :)

Thanks Megan Bowe and Aaron Silvers for talking me into this.

Friday, August 24, 2012

Course as Last Resort

As I may have mentioned in these pages (and have definitely mentioned in person) - I don't think I have created a legitimate "course" in a couple of years.

Sure I've created learning objects, quick references and other small chunks of helpful things.
"Courses", however, have tended towards "How do I find help / information?"

From the feedback I've been receiving - this is not a bad thing.

What is interesting is that projects and clients still insist on "courses".  Because they are familiar, comfortable, and a whole infrastructure of support has been built around this idea.

I've been a little bogged down creating a Telecommuting support site for IT recently.
This is a good thing.

It means that my colleagues and I are starting to look at courses as a tool in the training / performance support toolkit vs the only tool available.

So I've decided to play a little game with myself.

What can I do that is NOT a course that would help the end user?

I'm thinking this question will help me continue to dig up new ideas and resources on top of (or instead of) a "training event" that would really help people at the point of need.

It's worth a shot, anyway.

---------------
Please read Jay Cross' article The Game of Course.
Kinda like buzzword bingo - but with money.
Try this in a project meeting at your own risk :)

Wednesday, August 22, 2012

Weapons for Instructional Design

I tried to have the Instructional Design conversation with my clients multiple times over the past few years.

What I found - the client was usually so freaked out about creating an online tutorial or delivering a webinar, and the technology surrounding those activities, that the last thing they wanted to hear was that they needed to redesign the course too.

After too many lost battles and stressed clients, I reduced my "Instructional Design" message to - "Just think small chunks."  Find natural break points and break it there.  That seemed to be understood.

-------------
Still - I want more.....

I want to create training that is engaging.
That has a fighting chance of actually improving the business.

I'm thinking in my program, that will need to be phase 2.

-------------------

Tom Kuhlmann - The Guiding Principle
Steve Boller - iBooks Author
Allen Partridge - Arm Wrestling for Enlightenment
Gary Wise - Myopic Vision

Tuesday, July 24, 2012

Building a Mobile Strategy - Part 1a

As promised, the results of my conversation with the Mobility Guru.
---------------------

What Is Mobile?
After some discussion, the Mobility Guru and I decided to treat mobile based on operating system (iOS, Android, Windows RT) vs. hardware or form factor.  His original definition "can hold it in one hand" doesn't really account for netbooks or other forms that smoothly run pre-mobile operating systems (like Windows 7, Linux and the Mac OSs).

The complication with Mobile is not necessarily with the form factor.  It's with the OSes, their range of implementations (especially with Android), and the difficulty of handling various websites and existing applications.

We plan to change this definition.  Especially once we see how Windows 8 and OS X Mountain Lion work and evolve.  We both hope that the success of these products will merge the PC and mobile spaces so we only have to develop one thing for all form factors.

I would love to not have to treat mobile as an entirely separate beast.  Instead - design mobile-first and have it seamlessly go to other form factors without having to accommodate all of the variation.

We both may be dreaming.

Mobile Goal/Success Definition - IT
Stated goal (cribbed from the Strategic Plan for IT) - "Lead the Mobility Expectations for the Organization". 

Apparently, this is defined as the following:
  • Improve mobile access (which I guess means get rid of dead spots in our physical space)
  • Engage in outreach (no idea what this means)
  • Content-focused  (which is probably "get our existing stuff mobile-friendly")
Real goal - Not have the Super Muck get yelled at by VP and board member types. I can respect that.

I think they missed a real opportunity to think about mobile in a way that makes our business operations more efficient.  But that's a conversation reserved for my managers - and something I sadly only picked up on as the Mobility Guru and I picked through the Strategic Plan, after the senior executives had already asked for feedback.  My bad.

Mobile Goal/Success Definition - Training
Training and support - As Needed, Where Needed


IT General Mobile Strategy
Still in Development.  Or as the Mobility Guru put it - it will be determined by whoever screams the loudest.  Guess that's us :)

Device Support
We are assuming BYOD, whether it is stated or not.  Because that is the reality.
We have decided that getting stuff to work on Blackberries will be lowest priority moving forward.
Unless some miracle occurs that saves RIM.

We seriously hope that HTML5, as it matures, will help reduce concerns with content across the various OSs and hardware.
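To make that hope a little more concrete - a rough sketch of the kind of feature detection HTML5 makes possible: ask the browser what it supports instead of maintaining a version per device.  The tutorial URLs here are hypothetical.

```typescript
// Ask the browser what it can do, rather than sniffing for specific devices.
function supportsHtml5Video(): boolean {
  // Pre-HTML5 browsers create a generic element with no canPlayType method.
  return typeof document.createElement("video").canPlayType === "function";
}

// Hypothetical usage: serve the multimedia tutorial only when it will work.
const tutorialUrl = supportsHtml5Video()
  ? "/tutorials/intro-html5.html"
  : "/tutorials/intro-basic.html";
```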




Who else needs to be involved in this discussion?
Mobility Guru and I identified a couple of areas we need to reach out to.

The Web Folks.  Since a lot of stuff will be delivered through their shop, we need to make sure we are aligned with what they already have planned.  Plus they have toys.  We like toys.

Help Desk Reps.  I'm at a University.  Ideally, we'd have one rep for students and one for staff/faculty. 

An IT Architect.  We already have one as part of the team, but he's going to be spending the next 6 months figuring out where our entire system is and creating some configuration management.  And that's just the hardware.  If applications were added to that job, he could spend his entire career figuring out what we've got.  We'll keep him in the loop.
----------------------

So how does this all impact Training?
I took a lot out of this discussion and have a better idea of where I want to head - reserving the right to change course at any time.  I'm gonna call that "agility" :)

Thought #1
We are going to look at 2 tiers of offerings for mobile:
  • Cellular-friendly - text, minimal to no images, no video.  The goal is to minimize the cost to the university and/or employee.
  • WiFi supported - this will be all of the multimedia goodies.
Mobility Guru suggested that when we start developing mobile-friendly materials, we specify whether each one is cell-friendly or needs to be accessed via WiFi.  Basically - a replacement for my standard "how to work the tutorial" introduction.  Awesome idea.  A rough sketch of how content might detect this on its own follows.
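That sketch - assuming the experimental Network Information API (navigator.connection) is available, which is a big assumption since browser support for it is spotty.  In practice, labeling the material the way the Mobility Guru suggested is the safer bet; this just shows the idea.

```typescript
type Tier = "cellular-friendly" | "wifi-supported";

// Assumes the draft Network Information API; most browsers don't expose it,
// so we default to the cheap tier to protect data plans when we can't tell.
function pickTier(): Tier {
  const connection = (navigator as any).connection;
  if (connection && connection.type === "wifi") {
    return "wifi-supported";
  }
  return "cellular-friendly";
}

// Hypothetical usage: text-only version unless we know we're on WiFi.
const contentUrl =
  pickTier() === "wifi-supported"
    ? "/tutorials/intro-full.html"  // all the multimedia goodies
    : "/tutorials/intro-text.html"; // text, minimal images, no video
```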

Thought #2
Leverage vendor content.

With how rapidly our applications are updating and changing (please see Google), leveraging vendor content becomes even MORE critical.  Simply because we, as an organization, won't be able to keep up with all of the updates.

We already started focusing on organization-specific tutorials.  I think we need to get better at this.

Thought #3
Design and prioritize performance support

I've already been doing this - especially over the past couple of years. 
My clients, however, have been thinking "courses". 

An argument I heard from a manager was "They can do training while traveling on the shuttle.  It's still work time."  True.  I personally find that time to be valuable thinking and decompression time.  I get LOTS of ideas while not actively engaged in "gotta do" stuff.  It makes me more productive.  But I get where he is coming from.

Thought #4
Web apps first

There is just too much variance in our environment to be able to maintain native apps.

Fortunately - Mobility Guru and his friends are implementing a mobile device management system this fall.  He tells me there are cool things in there I might be able to leverage. 

I'll get a chance to see that up close, since it looks like I'll be creating support materials on it...

--------------------
Mobility Guru and I decided that chatting about this monthly will be useful. 

This will hopefully allow us to fine-tune our plans and maintain some level of focused flexibility as this whole plan evolves.

Thursday, July 12, 2012

Building a Mobile Strategy - Part 1

My first step: Align anything mobile I am doing with IT.
I have it easy - I am IN the IT department.  However, it is still very important that anything I do maps closely with what the rest of the department is doing (or wants to do).

The more closely I map to IT - the better the support.

In my mind - IT is the group who should be driving mobile device decisions.
NOT the training department.
Why?  Because they are the folks who will be fixing them.

Except for the most gadget-friendly among us, IT folks will also know more about what is out there, what is good and bad, and what they see coming.  Because they will have to support those devices and apps whether they want to or not.

The IT folks I know have a pretty healthy regard for self-preservation and don't like surprises. They also like looking competent - and most will research an issue above and beyond anything you can conceive of so they can a) fix the problem and b) not be surprised again.  Respect these traits.

Make independent mobile decisions and you will have some very unhappy IT support folks.
Unhappy IT support folks will probably not help the success of your program.

(OK - I am getting off my soapbox)
----------------------
Below I am sharing my questionnaire for my conversation with the Mobility Guru next week.

I will fill in the answers (as much as I can) after our conversation.

I have already given him fair warning :)
-------------------
What is mobile? (IT Dept definition):


Mobile Goal / Success Definition
(IT):


(Training):



Who are the stakeholders?



What other projects involved / impacted?



Device support
Preferred devices?:


Current, most popular devices?


BYOD?  Expectation that BYOD will happen anyway?



Does a general mobile strategy exist yet for the IT department?


Who else needs to be involved in this discussion?


How often do we need to review this?


How can we make it easy for folks to get EVERYTHING they need when working at the University, no matter where they are at?

Solution 1:
What involved / info:
Resources needed: 

Solution 2:
What involved / info:
Resources needed: 
(rinse and repeat)







Wednesday, July 11, 2012

Captivate 6 vs. Articulate Storyline

I am a Captivate shop.
Have been since Macromedia existed (Captivate 2?)
I've used Captivate through UI changes, a couple of squirrelly updates, and the long wait for right-click functionality.

So I go at this tool evaluation more than a little biased.  Because dangit - change is hard and scary.
Especially changing long-standing habits.

But I have more folks I need to support now.  They deserve tools that work for THEM, not just for me.

I was heartened to see Adobe address the HTML5 concerns I had. Still, something in my gut is making me just a little uncomfortable with the way Adobe is headed with this product.  And I have never had end-users seem so excited about a development tool.  So I had to go take a look at the shiny new thing called Articulate Storyline.

I am thinking that NOW is the time to make a change if I need to make a change.  It's the beginning of the fiscal year. I have not purchased the Captivate 6 upgrade yet (though I will need to since I have some folks in the pipeline), and if I have to retire Captivate as a solution - I can do so cleanly with this final version.


I've been spending this past week putting both products through their paces. I created the same quick tutorial (how to use Google Help) with both products.  Exact same steps.  I won't be sharing the tutorial links here since I haven't asked my bosses for permission.
Below is pretty much a copy of the document I shared with my management team.  It is still a work in progress, but at least it gives you an idea of how I am testing.  Notice how I focused on what I need to do NOW vs. shiny "new" features.
-----------------------------------

Reasons for looking at changing development tools:
  • Increasing use of mobile for training.
  • Discomfort with Adobe’s mobile strategy for Captivate
  • Feedback that Captivate is tricky to learn and not terribly intuitive.
    • May be overkill for most eLearning clients' needs
  • Unusually positive feedback from Articulate Storyline users during conferences.
    • Many of these users were Captivate users
    • Other tools do not adequately address software simulation


Decision: (not made yet)


Cost
Articulate Storyline
Academic and group discounts available.  
Licensing will need to be negotiated

Captivate 6
Confidential


Usability and Technical Comments
Note: Audio tested in a room with a really large air conditioning vent blowing very cold air.  This did an excellent job of testing any possible noise-cancelling feature in the product.

Articulate Storyline
  • PowerPoint-like interface makes it easier to teach and less intimidating for new users
  • Pre-developed interactions very easy to edit
  • Objects default to displaying until the end of the slide vs. having to reset the timing for every object like in Captivate
  • Love the scene organization - makes it easier to find and edit sections of the tutorial
  • Movies - really easy to create.  Best screen-capture.
  • Very easy to add objects
  • Audio-recording - weakest part of the product.    Would be better off using Audacity, then adding audio to the project.
  • Have to post any HTML output to test.
  • Keeps ALL items separate - the resulting output file is not as large.


Captivate 6
  • REALLY slow.  Especially initial loading.  Took a couple of minutes to get the product started.  Captivate 5.5 takes about 30 seconds.
  • Themes override ANY settings in your presentation.  This is a “use with caution” feature.
  • Captions do not pick up the names of the links - they just give a generic label.  Captivate 5.5 did a MUCH better job of naming the links.  This will slow down production.
  • Screen capture very slightly better on Captivate 6 than on Captivate 5.5.  Not as good as Articulate.
  • Styles = much quicker caption change vs. Articulate.
  • Firefox plugin crashes when posted and tested on PC
  • Audio slightly better than Articulate Storyline.


Mobile Test (got rid of the table - HT to @oxala75 for pointing out the display problem)

Note: HTML5 output only displays correctly on HTML5-compatible browsers.  Technology not quite ready for prime-time.  (I posted HTML and HTML5 publications on an FTP site our team uses for evaluation and testing of online tutorials.)


Note: The tutorials generated are not particularly appropriate for phone-size devices.  Android tests used a Samsung Virgin Mobile SPH-M910.

Kindle Fire HTML5
Articulate Storyline - Works.  Get “Loading Video” in between each screen.  Can tap a Marker to see what happens next (it is a rollover on the PC).  Seems to be missing text captions.

Captivate 6 - Get a “browser does not support content” warning.  Once past that, works cleanly.  All text captions and audio present.

Kindle Fire HTML
Articulate Storyline - Works cleanly - better than the HTML5 version.  Resizes correctly and all markers and text captions appear.

Captivate 6 - Works cleanly.

iPad HTML5
Articulate Storyline - Works.  Get a Play button before the start.  Can tap a Marker to see what happens next (it is a rollover on the PC).  Seems to be missing text captions.

Captivate 6 - Works cleanly.

iPad HTML
Articulate Storyline - Does not work without the Articulate Mobile Player app.

Captivate 6 - Will not find the page at all.

Android HTML5
Articulate Storyline - Takes forever to load.  The menu takes up most of the screen real estate.  Baseline tutorial never loaded.

Captivate 6 - Get a “no support” message.  Baseline tutorial never loaded.

Android HTML
Articulate Storyline - Will not find the page at all.

Captivate 6 - Kicks to the “get Flash Player” app, but my device won’t work with it - then nothing.  The phone used is not compatible with the Flash Player app needed.


SkillPort Publication
Articulate Storyline
  • SCORM 2004 publish to SkillPort - received an error.  Null reading on a file.
  • SCORM 1.2 publish to SkillPort - clean.  Works as expected.  VERY fast load time to SkillPort.

Captivate 6
  • Works the same way as Captivate 5.5.
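For anyone wondering what these publish settings actually produce: under the hood, the packaged tutorial talks to SkillPort through a small JavaScript API object the LMS exposes in the player window.  A bare-bones sketch of the SCORM 1.2 half of that conversation is below (TypeScript; both tools generate this plumbing for you, and I've skipped the window-walking needed to locate the API object).

```typescript
// The SCORM 1.2 runtime API the LMS exposes to published content.
// Finding this object (walking parent/opener windows) is omitted here.
declare const API: {
  LMSInitialize(arg: string): string;
  LMSSetValue(element: string, value: string): string;
  LMSCommit(arg: string): string;
  LMSFinish(arg: string): string;
};

// Report a completion and score using SCORM 1.2 data model elements.
function reportCompletion(score: number): void {
  API.LMSInitialize("");
  API.LMSSetValue("cmi.core.lesson_status", "completed");
  API.LMSSetValue("cmi.core.score.raw", String(score));
  API.LMSCommit("");
  API.LMSFinish("");
}
```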

------------------------------
I have asked Sid and a couple of other end users to look at the Articulate Storyline trial.  I figure they will give me better input regarding usability.  I will use their feedback to help me with my final decision.

Of course - this whole thing brings up the other question: do I want to have "one tool to rule them all"?  Or should I come up with another way of doing this?