Friday, February 15, 2013

Data is meaningless out of context: thoughts on the Tin Can API (xAPI)


I just posted a comment on Dr. Eric Fox's blog post from last fall on the Tin Can API: A Behavioral Scientist's Initial Thoughts on the Tin Can API, Big Data and Learning Analytics. If you haven't read it, you should. My comment to him is re-posted below. Let's keep the conversation going.



Data is meaningless out of context
Hi Eric,
GREAT post and happy to see you've delved into one of my biggest issues with the Tin Can API. Ironically, someone pointed out this post to me, having just left ASTD's TechKnowledge conference where there was continued buzz and interest. As an immersive learning designer and someone who has struggled with how to quantify practice into meaningful data, I am thrilled that there may be an emerging standard that could help capture that data. The problem, as you correctly point out, is that reporting activity neither demonstrates learning nor performance improvement.
The hard work is in establishing actual performance metrics and measuring improvement through various learning activities. Simply reporting that you did something doesn't show qualitatively OR quantitatively whether that activity has any impact on what you know, or what you can do better.
I see potential here, but it troubles me that people are too "oooh! shiny!" about one minor piece of a much bigger body of work, namely, correlating activity to performance. There are A LOT of potential problems with the Tin Can API, and so far, I haven't seen any best practices, use cases, or case studies that demonstrate its most effective use. Another problem? Those directly involved in the creation of the Tin Can API jumping into every discussion and squashing the much-needed conversation among the rest of the community. Much like a community manager who tries to force conversations in a certain direction, there has been much talk in the industry about our inability to have deep conversations about the pros and cons of the Tin Can API without the developers and evangelists inserting themselves and trying to guide the conversations. Like pushy salespeople, they are becoming a turn-off for people who want to research, investigate and share their own conclusions. If I want to hear the sales pitch, I'll talk to a salesperson. If I want to talk to my peers, I go to social media. Unfortunately, the salespeople have hijacked these social conversations and are creating an atmosphere where no one wants to participate.
It reminds me too keenly of my experience with virtual worlds...the developers so in love with what they had built that they stopped listening to the concerns, needs, and objections of consumers and their potential customers. When you build something, you sometimes get too close to it to be able to see the chinks in the armor. When developers start arguing with naysayers, it's a clear sign that it's time for them to let go. Like an artist, you can have intention with your work, but the true meaning is what every person brings to it. It may be time for the developers to step down, and let practitioners lead the next phase of the discussion to help the Tin Can API survive its inevitable fall into the "trough of disillusionment."
Let's hope the conversation continues and that the real problems with the Tin Can API are not ignored. No system is perfect, but ignoring the problems, or trying to squash the conversation about them, does not make them go away. I'm hopeful that we can all learn from each other in making the Tin Can API useful and meaningful.

42 comments:

  1. Hi, Koreen!

    I see where you're coming from, but it seems like some (not all) of the counters are coming from an adversarial position concerning a technology that:

    1) Isn't finished yet.
    2) Isn't really "the thing"; it's just the underlying protocol to enable "the thing."

    Totally agree that there are a lot of potential problems with the *use / employment* of the Tin Can API, not unlike any technology standard. Some are innocuous (collecting data you'll never use... sort of like the LMS today) and others are potentially nefarious (privacy issues with exposure of puzzle pieces). These are symptoms of something that isn't necessarily caused by Tin Can itself. They're still concerns, though, and I'd love to see a discussion that surfaces all of the problems and explores ways to form those best practices you're talking about.

    I'm not sure the failure of these issues to surface in discussion is entirely the fault of the tech folks trying to nurture the monster to life:) Sure, in some cases, there is clear ugliness, but most of it is just helpful energy.

    "There are A LOT of potential problems with the Tin Can API, and so far, I haven't seen any best practices, use cases, or case studies that demonstrate its most effective use."

    For a split second I took this personally. It passed quickly:) I'm one of the folks that published some forecast use cases, a few of which I think are directly tied to performance metrics -- mine was a bit jargon-y, of course, as it was targeted at internal folks. And these cases are likely more in line with the designed intent of the API, as we're military. Harder to do with softer data in the corporate environment.

    http://androidgogy.com/2012/12/11/tech-people-and-systems/

    Data quality is a serious concern. Reporting is a serious concern. Folks already ask for plenty of data to be captured in the LMS and that data is rarely employed usefully to improve business performance. Is that because the data isn't useful? Maybe. Good questions to ask.

    I'm a tech guy and I love the potential of the Tin Can API but there's no reason for us to be on opposite teams:) Thanks for bringing this up, let's try to keep it going publicly -- there are a few blogs out there that have published similar questions. Maybe a blog carnival to surface big questions would be useful?

    Replies
    1. I wrote a whole reply and Blogger ate it...trying again :)

      First, I hadn't seen the forecast use cases! Thanks for posting them...good stuff! The next step (you know what I'm going to say) is to get people poking holes and getting their hands dirty...knowing that the spec isn't done and that there will be holes without even poking.

      My bigger issue is that there is currently a dynamic where, if someone asks a challenging question, the response from the people closest to the spec is swift and challenges right back, often with an overtone of "why are you even asking this?" or "you must not know enough about what we're doing, let me better inform you." It's creating an atmosphere where people with the most experience in the learning industry are having side (not public) conversations to avoid interactions with the people who need to hear their feedback the most.

      I'd rather not see people dismissing the potential of Tin Can API because of the current community climate around it. Throwing out the baby with the bathwater means we'd never get to see that kid grow up. I want to see the Tin Can API grow up...and part of that process is its parents letting go and seeing what happens :)

    2. Absolutely. We really should want people to poke holes in concepts before they are fielded. How much does it suck when hypotheses don't hold up? Lots:)

    3. btw, love your idea of a blog carnival...would be interested to see who would participate in the broader learning community!

  2. Hello Koreen, thanks for posting this and sharing your thoughts. Based on recent Twitter discussions, it is clear that Twitter may not be the best venue for deep musings and extensive Tin Can debate :)

    Following along and participating in some of the conversation, it wasn't clear to me that there was such a divided line or squashing of ideas between practitioners and developers.

    Additionally, I don't think it's fair to draw that line...example: I am part developer (UI/front end design), part practitioner (experience in Sales Training, Learning & Development) and there are others out there like me. So am I a sales person, a developer, or a practitioner? What side should I take?

    I fear we are all jumping to early conclusions about technology, design, and Tin Can in the form of 140 characters without clearly understanding others in the same community.

    I agree that there must be open dialogue, and a lot of times that comes with passionate debate but that shouldn't be confused with forceful squashing of ideas or hijacking of social media.

    Replies
    1. I'd rather there not be sides :) I was really referring to "developers" as those working on the spec for Tin Can API, not developers in general. It sounds like you're much like me, a designer and a practitioner, and someone who isn't directly involved in developing the Tin Can API. I have questions, not for the developers of the Tin Can API, but for the larger community. If those organic questions and conversations can't flow among the larger community, it will continue to feel like there is tension and a divide between us and "them."

    2. No, we are highly involved as major contributors to the Tin Can API, at Saltbox (a unique bunch to say the least). I see your point, but I feel it's still important to involve the dev community...practitioners should be demanding what they need and driving the overall direction, and doing less technical analysis of the spec (privacy, extensibility, what it enables, etc).

      More of this --> "As a learning & development professional, I want to ensure that data about my learners activities are owned by the learners so that they have more control of their achievements and so it is not abused."

      Less of this -->"As a learning & development professional, I think the Tin Can API lacks privacy and will be abused by the evil empire that I work at."

    3. Respectfully disagree on two of your points...

      As a practitioner who is recommending the spec to my large global employer, I have every right to do in-depth technical analysis of the spec regarding privacy, extensibility, etc. It is my job, and our company, on the line. We have global privacy issues that are real and serious and that will prevent us from implementing this (or any) spec if it doesn't adhere to what we need for our customers. We also have short- and long-term roadmaps that will determine the usefulness of an investment in any spec, and extensibility is a critical decision-making factor in how we proceed toward adoption. We didn't just fall off the turnip truck; we have broad and deep technical expertise in our business, and our job is to challenge whether or not the Tin Can API can meet the needs of our business.

      And second, if you want feedback, you can't dictate the way it is communicated. Both of your examples are valid feedback, one just elicits a defensive response. As I tell my kids, you have to look at the underlying meaning of communication and not just react to words you don't want to hear :)

    4. I was using the privacy issue as an example we are all familiar with. On the issue, there is an IT department to help with that, so it would be best to consult with them (and the Tin Can dev community). There is nothing wrong with raising the concern or demanding clarity. Privacy concerns are real, no question there, and there needs to be deep analysis, but is L&D the best fit to do an in-depth technical analysis? Should they be? I don't think so, but I may be wrong...just drawing on my own experience :)

      On the second piece...Sorry, I really didn't intend to dictate style (it was really meant to be a recommendation); I just think it is more productive to state the needs rather than accuse a technology standard without enough information, as Steve and EdCetra have shared. I think it's less about meaning or words and more about clarity in describing a practitioner's concerns and needs.

      We're all big kids, but there is always a more productive way to discuss/share concerns and needs.

    5. Totally agree on etiquette, but feedback comes in many ways...

      And maybe I'm unique (but I'm not sure that's true), but I'm a product manager at a mid-sized, quickly growing global learning company, and my focus is on our organizational learning customers, which means that I actually DO need to make the decisions on privacy, security, integration, extensibility, and technical specs. I need to make decisions on everything from browser support to data tracking, reports to error messaging to what is supported on mobile devices. Am I an IT person? No. Do I need to make strategic decisions for my organization on very technical issues? Absolutely. Just because I'm a designer and a learning & development professional does not negate my need to do a technical analysis to qualify options before we make an investment. Nor does it mean that my concerns should be dismissed because I'm not technical enough. Ultimately, I'm spending the money. If you can't convince me, then I'll spend that money elsewhere.

    6. As a Product Manager, you don't represent the majority of learning professionals (and you work for a learning technology company); there are highly varied levels of technical understanding of the Tin Can standard. Every practitioner has every right to be a part of (and drive the need for) technical analysis, but I would argue that some form of consultation is beneficial to the organization for 8/10 practitioners. What we are seeing today on Twitter, etc., as you have described in your original post, is almost always random blame and uninformed speculation.

      I think it would be nice to have some kind of formal environment (other than a blog) for sharing concerns/questions/needs...I'm thinking a place where a practitioner posts a need and others offer potential solutions, kind of like what's happening below which I think is very productive. Thanks for the discussion!

  3. Hmmm...as one of those people that likes to hijack conversations I think the biggest problem with the xAPI is that people are attributing problems to the spec/standard itself, where really there is no problem inherent in the spec.
    The problem is:
    a) People's perception of what it does
    b) Lack of understanding about the spec and its uses
    c) An industry completely underqualified in data science

    I'm not sure the xAPI sales people are trying to sell it. I do think, however, that the messaging coming from the community needs to stop, because unfortunately the essence of what the spec does is being completely bastardized and it's nearly impossible to actually understand what it is. Here's a great example. This is from Craig Weiss, an alleged expert in this stuff:
    "Tin Can API – Better known as Tin Can. Communicates instances between a system and a device. Right now works with mobile devices. Could in the future work with XBox 360, SMART TV and other items of that nature."
    ------
    What the hell does that mean? It's total garbage, but if that's the source of information and you want to have a conversation about it, then it behooves the folks that brought us TinCan to intervene in that conversation to say you're talking about the wrong stuff.

    At the end of the day, SCORM had no problems either. There may have been issues trying to use SCORM to achieve something specific, but that's not a 'problem' with SCORM. The troubling part about all these conversations about the issues with TinCan is that no one is actually saying 'I've tried to achieve this and I can't do it'. The 'problems' that people attribute to TinCan have nothing to do with the spec itself. So if you want to have conversations about the limitations or implementation hurdles for TinCan, that's fine. But don't go assigning issues to the spec itself.

    Replies
    1. I have no problem with the spec...I do have problems with the current atmosphere in the learning community as people are trying to work through understanding it. Is what Craig is saying about the Tin Can API wrong? I don't actually know that. It seems like he's giving concrete examples of technologies that could be used with the spec. Maybe that's an over-simplification, and certainly in this quote, out of context, he's not addressing the how or why. Does that mean he's not an expert? I don't know that.

      I DO want to have conversations about the limitations or implementation hurdles. That's exactly my point. No one wants to engage in those conversations in an atmosphere where they don't feel like their questions and complaints can be worked through respectfully. Many of us have been in this game a LONG time. This is not the first spec to come along. It may be a great one...but that doesn't mean we shouldn't be able to challenge its value (if not the spec itself).

  4. Another note...here are some great existing ways for practitioners, developers, evangelists and sales people to engage and work together, today, open, not vendor associated:

    -Tin Can API Adopters Group (https://groups.google.com/a/adlnet.gov/forum/#!forum/tincanapi-adopters)
    -Tin Can API Specification Group (https://groups.google.com/a/adlnet.gov/forum/#!forum/tincanapi-spec)
    -Weekly Tin Can API Calls (listed on the groups above)
    -Wikispaces, ADL (http://tincanapi.wikispaces.com/TinCan+Use+Cases)

    All great ways to put ideas on the table and work through them as a community. I think people will be surprised how helpful these resources can be.

  5. I think all of your comments are variations on a theme, so I'll address them together (or maybe I'll break out responses, too...).

    I am actually VERY excited about the Tin Can API.

    I don't think there is anything wrong with the technology, or the standard, inherently. Or, at least, I don't know if there is yet. I think the reason I don't know if there is yet is because whenever I've seen people raise a question, or challenge the potential, there's been a swift and decisive response from those closely involved with the API. Sometimes I've felt those responses have been dismissive, sometimes patronizing and sometimes so defensive as to seem like attacks.

    The truth is, I see amazing potential for design, and flexibility for capturing and aggregating data in ways we haven't been able to do before. The bigger questions I have are: So what? Do we need more data? What does this data mean? Is it meaningful for the work we're doing? If we can answer those questions, show examples, and engage a broader audience in ongoing questions and answers, I think the Tin Can API could change our industry. But we need to have open dialogue among practitioners...not the developers...and find ways that we can learn from each other without perpetuating the current atmosphere in which anyone who challenges the value or questions the implementation gets shut down. The naysayers and skeptics should be allowed to speak, and we all should learn from their questions, as well as the process of finding the answers.

  6. Hi Koreen

    I think I could possibly understand why some of the people closely involved with xAPI may feel defensive and, as a result, compelled to leap into conversations.

    Not all the commentators have been as balanced as you in their handling. A great deal of criticism has been levelled at the protocol for issues that have nothing to do with it - tracking activity and experience will not be appropriate in all situations, but just because the xAPI does a better job of doing it doesn't mean it is itself to blame if it is misused.

    So if some of those involved come out to defend their standard against misguided (if well intentioned) criticism, I don't think it's unfair.

    Equally, some developers are seeking to engage with misconceptions about the standard, in the way that Andrew Downes, who it would seem has been involved with the spec, interceded in the comments from which this post originated. His response to a reasoned critique on the issue of self-reporting - problems with which can clearly be seen in the case of the Tappestry mobile app, for example - was to point out that, far from having been an oversight on the part of the developers, it had been anticipated and a possible solution already built in. It would take a steely cool NOT to feel impelled to respond to a criticism you had already considered and taken steps to address.

    Which is not to say that all involvement by the developers has been benign - there have been some very uncool actions by some individuals that are wholly regrettable. But let's not begrudge developers engaging with their user community - under most circumstances, that can only be a good thing, surely?

    Replies
    1. I totally agree, Dan...seems like what should be reasonable discussions have turned emotional, and that in turn makes them unproductive. I think the balance of education on the Tin Can API by developers and reaction to questions or comments (well worded or not so much) will hopefully begin to even out as more and more people are asking the same questions.

      What is unfortunately left out of many of these discussions, and what I hope to hear much more of, are the business problems that the Tin Can API can help to solve. For example, I work in an organization where we currently do not allow our content to be hosted on an LMS, and therefore have obstacles with SCORM for reporting activity data. It seems that Tin Can can solve this issue. That is an actual direct business issue and limitation of SCORM. Those are the types of use cases I'd love to see shared. (And gee, I hope I'm correct on this particular use case!)

  7. First: Great comment thread, and thank you for having the metaphorical balls to put this out there knowing the current climate and expected backlash. I completely agree that there is a lack of openness regarding Tin Can API discussions. I see too many discussions that tread the line of right and wrong instead of shared understanding.

    I don't have a problem with the spec either, and fully support anything that's going to help us be more effective in our work. And it's that second piece that's highly lacking in the conversations, and becomes adversarial when you try to discuss it: How will this help us be more effective?

    I think part of the confusion is that those developing the spec are speaking to designers as if they were developers. As a contrast, let's go back to when the iPhone was coming out, and touch screen interfaces were about to change the game. Developers are concerned with how the device will interpret a finger touching and moving across the screen as input. Designers just want to know that the touchscreen works and enables them to build something like Fruit Ninja, which wasn't really possible before.

    I understand the spec is being built and is still evolving. I also understand that the building of the spec is separate from the usage and implementation of it. But the message from the development community is consistently "you need to care". If people are going to care, they need to understand, and for them to be able to understand, it needs to be discussed in a way that has context.

    Replies
    1. Context! Yes! That was actually what I was getting at in my reply to Dan. I need to know the technical aspects, yes...but more importantly, I need to know how to use Tin Can to solve my current problems. I don't dislike SCORM; it just wasn't built for what I want and need it to do.

      What would be EXCELLENT is for us as a learning community to start collecting those issues: I want to do X. How can Tin Can help me do this?

      I think posing those scenarios would be an excellent way of engaging the Tin Can dev community to start building the value of the spec in context.

  8. I'm a designer first and a novice developer at best. I *get* TinCan and I'm excited about its evolution. Like Koreen, I don't dislike SCORM, but it has its limitations.

    Developers of the spec (some good friends of mine) are answering the questions from their perspective - "Yes, and here's why." And they're great answers. Yet most folks glaze over at the 'here's why' part because they need to hear it in the context of their world - a cue I think the developers miss.

    The climate around TinCan is at a point where it's time to push it over the edge to the next level of conversation. We've beaten this horse enough. Perhaps when folks start asking "Why should I care?", the follow-up question is "What do you want to do today that you can't, in terms of making an impact on your org's performance levels?"

    Mike Rustici said it best recently, [paraphrasing] "We're in a box currently. TinCan is the birth of a new technology that will change things in the next 10 years. Get excited about that! But, in 10 years (if not sooner) we'll be in a new box and will change again."

    To follow Koreen's lead on collecting issues of "I want to do X...", here are some off-the-cuff thoughts. Whether they're real problems or not, I think this may be the missing piece the conversation needs. If TinCan can help with these, how? If not, why?

    1. I have a fleet of 4,000 vehicles. Brakes are replaced once a year. However, over 300 of those vehicles need brake replacements 2-3 times a year. I need to know if those vehicles are in fact faulty, or do I have a driving performance problem with drivers riding their foot on the brake pedal? Can TinCan help me with that?

    2. I run a retail operation. Cashiers conduct several different types of transactions. A particular transaction is always being voided or corrected. I need to know if it's a usability design issue in the Point of Sale software or a training issue I need to address. Can TinCan help me with that?

    3. I'm the director of nursing for a large hospital. We are constantly evolving our processes. Can TinCan help me with process improvement metrics?

    4. I'm the VP of a large bank with several branch offices. While most of our transactions are automated and digital, some transactions are still managed in person with the customer. Human error is common. I need to isolate those gaps. Can TinCan help me with that?

    5. I'm an AP Algebra teacher. Most of my kids do fine, but sometimes the lessons become too difficult for them to keep up. I want to teach them applied algebra through real-world problems to solve. Can I measure and compare the effectiveness with TinCan?

    Replies
    1. Kevin, this is excellent. Love the specific questions! Now, who can answer them? And also mine: I don't want to host content on an LMS, but I'd like the activity data recorded from my site to be sent to the LMS for reporting purposes. Can TinCan do that?

    2. I think I may actually know the answer to that question. Your activity data would be housed/stored in a Learning Record Store (LRS). You can do whatever you want with the data from there, such as sending it to an LMS for reporting.
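
      A very rough sketch of what that round trip might look like, as I understand it (my own illustration, not an official example -- the endpoint URL, credentials, course IDs, and version header value below are all made up / assumed):

      # Sketch only -- hypothetical values throughout.
      import json
      import requests

      LRS = "https://lrs.example.com/xapi"            # made-up LRS endpoint
      AUTH = ("lrs_user", "lrs_password")             # made-up basic-auth credentials
      HEADERS = {"X-Experience-API-Version": "1.0.0",
                 "Content-Type": "application/json"}

      # A minimal statement: actor, verb, object (plus an optional result).
      statement = {
          "actor": {"mbox": "mailto:learner@example.com", "name": "Pat Learner"},
          "verb": {"id": "http://adlnet.gov/expapi/verbs/completed",
                   "display": {"en-US": "completed"}},
          "object": {"id": "http://example.com/courses/safety-101",
                     "definition": {"name": {"en-US": "Safety 101"}}},
          "result": {"completion": True, "score": {"scaled": 0.85}},
      }

      # 1. The activity provider records the statement in the LRS...
      requests.post(LRS + "/statements", headers=HEADERS, auth=AUTH,
                    data=json.dumps(statement))

      # 2. ...and a reporting tool (an LMS, a dashboard, whatever) pulls it back out.
      report = requests.get(LRS + "/statements", headers=HEADERS, auth=AUTH,
                            params={"activity": "http://example.com/courses/safety-101"})
      for s in report.json().get("statements", []):
          print(s["actor"]["mbox"], s["verb"]["id"], s.get("result", {}))

      The spec only covers getting the statements in and out; anything you do with them after that (LMS reports, dashboards) is up to whatever sits downstream of the LRS.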

    3. Hey, Koreen -

      Depending on your configuration, YES. In the case of Lynda.com -- let's say you want to:

      1) Have folks link into your catalog from their own site.
      2) Have the user's own (company) site pass their individual authentication info and an LRS endpoint configuration to Lynda.com
      3) Have the course launch from Lynda.com after they authenticate
      4) Have the participant finish the course at Lynda.com
      5) Have the course pass the completion back to the LRS endpoint (at their company)

      So the content launches from Lynda.com > passes completion back to the company. Totally what the spec was designed for.

      Lots of assumptions here. To make this work, the company needs to have an LRS or have access to a SaaS LRS like Saltbox. Is that kind of what you were thinking?
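
      If it helps, here's a very rough sketch of that flow in code. Every URL, credential, and name is made up, and passing the endpoint/auth on the launch link is just a common convention I've seen used, not something the core spec mandates:

      # Sketch only -- hypothetical URLs, tokens, and identifiers throughout.
      import json
      from urllib.parse import urlencode

      import requests

      COMPANY_LRS = "https://lrs.examplecorp.com/xapi"   # the company's (or SaaS) LRS
      COURSE_URL = "https://provider.example.com/courses/excel-201/launch"

      # Steps 1-2: the company's site builds the launch link, handing the content
      # provider the learner's identity plus the LRS endpoint and a short-lived credential.
      launch_link = COURSE_URL + "?" + urlencode({
          "endpoint": COMPANY_LRS,
          "auth": "Basic bGF1bmNoOnNlY3JldA==",          # made-up launch credential
          "actor": json.dumps({"mbox": "mailto:learner@examplecorp.com"}),
      })
      print("Launch link handed to the learner's browser:", launch_link)

      # Steps 3-5: after the learner finishes at the provider, the provider's player
      # posts the completion statement back to the endpoint it was handed at launch.
      completion = {
          "actor": {"mbox": "mailto:learner@examplecorp.com"},
          "verb": {"id": "http://adlnet.gov/expapi/verbs/completed",
                   "display": {"en-US": "completed"}},
          "object": {"id": "https://provider.example.com/courses/excel-201"},
      }
      requests.post(COMPANY_LRS + "/statements",
                    headers={"X-Experience-API-Version": "1.0.0",
                             "Content-Type": "application/json",
                             "Authorization": "Basic bGF1bmNoOnNlY3JldA=="},
                    data=json.dumps(completion))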

    4. This was my understanding, too :) Appreciate the confirmation that I'm on the right track!

    5. Hi folks,
      Great that you're coming up with specific questions and thoughts about what Tin Can can do. I'll see what I can do to answer some of these, but I'm off to bed soon. My website, http://www.tincanapi.co.uk, will also help. The links under "Epic articles" relate mostly to learning design. I'd also recommend that you guys read the actual Tin Can specification itself. If sections 1 to 5 are too technical for you guys to follow, then the spec needs to be made clearer. Please flag up anything you don't understand.

      Koreen - yes, one major feature of Tin Can is being able to track any learning anywhere, not just stuff in the LMS.

      Kevin, I'll take your examples one by one in subsequent posts.

      Andrew

    6. Hi,
      I posted another couple of replies responding directly to questions people had but I don't think they have come through. Let me know if I need to re-write.

      I also wanted to add that the Tin Can working group is not an organisation, but a group of interested people, many of whom are volunteers. I'd love to see more of us learning designers in the working group sharing the kinds of cases and ideas in this post. The LRS developers are quite well represented and whilst we are getting more and more learning designers joining the group, I'd love to see our numbers equal the developers.

      The best places to start getting involved are the Google groups, the weekly call and the GitHub repository. These are found here: http://www.adlnet.gov/capabilities/tla/experience-api and here:

      https://github.com/adlnet/xAPI-Spec

      If you can, please take time to read the spec on github, particularly sections 1-5. If anything does not make sense, raise it as an issue here ( https://github.com/adlnet/xAPI-Spec/issues/new ) and if possible, suggest a re-wording to make it clearer.

      Andrew

    7. Hi Andrew,

      Great resources and thanks for the thoughtful responses! Really looking forward to seeing this conversation continue, and it looks like there are some good places where people can continue to talk. I would also like to see more designers involved, but I'd actually like to see more executives involved...while this is something that can be utilized by learning designers and developers, the ability for Tin Can to solve larger organizational issues has meaning beyond what we traditionally think of as the "learning industry." It would be great to find ways to speak to larger organizational needs to help arm learning professionals with business problems to solve, not just learning problems to solve.

    8. Hi Kevin and Everyone,

      I've started a series of blog posts where I intend to tackle all (modulo how busy I get) of the scenarios raised in this thread, and more besides if people keep bringing interesting new ones up. The first post can be found at: http://blog.saltbox.com/blog/2013/02/18/how-tin-can-can-help-vehicle-fleet/

      I hope they prove helpful.

    9. That's awesome, Russell. Look forward to reading all of your responses!

  9. Okay -- I'll play a slightly different version -- things I think Tin Can should do, but would like to know when/where/how.

    - I'd like a learner to be able to save the data out of a learning experience that they think is relevant -- to a virtual notebook, or a file they can print or whatever. How can Tin Can help do that?

    - I'd like to know which questions learners get wrong, and if they favor particular wrong answers. How can Tin Can help do that?

    - I'd like learners to be able to try answering a question, and then be able to compare their answer to the answers other people gave, or check their choice against the aggregate. How can Tin Can help do that?

    - I'd like learners to help create the content, rather than just have pre-created content projected at them. How can Tin Can help do that?

    I could do this all day, but let's start with those :)

    Replies
    1. Julie's are harder:( I could be totally wrong... But here's my understanding.

      - Depends on the type of data and the method of storage. Anything is possible, but I don't think local / personal storage is what the xAPI was designed for. There's nothing saying you couldn't have an endpoint on your own device to store stuff offline. As long as you can get to an endpoint, a statement could carry the kind of data you're talking about. Formatting and printing that data on cue wouldn't be a function of xAPI. Statements are just data; applications and frameworks would need to be built to do special stuff with it.

      - Sounds like an analytics requirement. Could xAPI capture answers? Yeah! (Rough sketch of what that might look like after this list.) But analytics / reporting isn't part of the spec. This is one of those weaknesses folks will point out, but it's not a weakness of the spec, it's an indictment of the shitty tools we have to do analytics of data. Capturing data and making it useful are different things. Different problem:) xAPI solves the former, not the latter.

      - Displaying compared answers... I'm not sure Tin Can was built for that. This would be up to whatever is downstream of the LRS if it were possible. Could be another technology would better serve the need. I'm least confident about this answer, my gut says the data will be sandboxed and protected by default (your data is only for you and whoever has access to the database / reports).

      - My first answer was "xAPI won't help with content creation, I'm afraid." My second thought says "Hmm... now that could be interesting / possible." As long as the statements could hold the types of content creation "commands" you need to construct an experience, I suppose you "could" do some type of collaborative making using statements. The burden is on the LRS and how the LRS processes the statements. There could be better tech for content creation but I'm thinking you might be onto something there, Julie:)
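
      On the wrong-answers point above, here's a rough sketch of what one captured answer might look like as a statement. All of the IDs are made up, and the tallying of favored wrong answers would happen downstream of the LRS, not in the spec itself:

      from collections import Counter

      answer_statement = {
          "actor": {"mbox": "mailto:learner@example.com"},
          "verb": {"id": "http://adlnet.gov/expapi/verbs/answered",
                   "display": {"en-US": "answered"}},
          "object": {
              "id": "http://example.com/quizzes/quiz-3/question-7",   # made-up question ID
              "definition": {"type": "http://adlnet.gov/expapi/activities/cmi.interaction",
                             "interactionType": "choice"},
          },
          # result.response records *which* option was picked; success records right/wrong.
          "result": {"response": "choice_b", "success": False},
      }

      # Downstream reporting (nothing to do with the spec itself): tally the distractors
      # learners favor. In practice, statements would be pulled back out of the LRS.
      statements = [answer_statement]
      favored_wrong = Counter(s["result"]["response"]
                              for s in statements if not s["result"]["success"])
      print(favored_wrong.most_common())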

    2. Steve, I totally love that you're jumping in to tackle these! Would love to hear more about the analytics part, or what regarding reporting/analytics is/isn't part of the spec. If data is being captured, how might we start creating meaningful reporting? Perhaps this is part of the analytics education that Reuben was talking about...but if better understanding of analytics will help make the API successful, now might be a good time to start those conversations too.

      Ironically, that ties into my blog post earlier this month...it's weighing heavily on me right now...

    3. Analytics is my biggest concern with this whole "big data" thing. We're seeing chatter about big data inside government. Big data and cyber security as skill sets.

      The skill set is one thing. Seems to me, analytics is about tools to organize and expose nuances in the data so that people with the skills can approach it. Not a really great showing from tools in this area. I wouldn't completely fault vendors. Customers don't really know what they want. Most have no idea how to employ what those analytics might tell them, much less describe what the tools would look like to get them there.

      I'd love to see more discussion about the analytics piece and what a vision for "big data in context" would look like. There might be great analytics tools out there already for some contexts. What do they look like? Where are they? Where are they useful? Where are they useless?

    4. Agree, you hit it on the head Steve. We really don't know what we need yet. There are some interesting statistical and predictive models that can be applied to analyze Tin Can data, but we need some more solid business cases. We've got some interesting things coming soon that we've been working on, after talking to a few hundred L&D folks over the past 9 months. Stuff will get more interesting but we need at least 2 things...more data (from multiple sources) and more feedback about what is needed.

    5. Hi Julie, I've tackled your first question, about saving out experience data, as the (belated) 2nd post in my Tin Can Can series: http://blog.saltbox.com/blog/2013/04/02/how-tin-can-can-help-save-out-experience-data/

  10. Koreen, thank you for calling the question about the value of data without context. I would even take it one step further and suggest that data themselves are meaningless until they are turned into actionable information.

    Most people do not understand how long it is going to take the learning world at large - people in training rooms, learning online, immersed in a simulation, learning in banks and on ships and in schools - to think of data as their friend. One might be surprised to know how many enterprises stop collecting learning data when they discover they can't afford the burden of knowing what those data reveal. And we are just getting started with this notion of data collection. The culture wars over this are going to be awe-inspiring.

    It's good that some spend their time exploring what the API can do. The rest of us will be exceedingly well served to be thinking about what the data being collected can do to make a difference.

    Replies
    1. "One might be surprised to know how many enterprises stop collecting learning data when they discover they can't afford the burden of knowing what those data reveal."

      Yes. This. We already see this a bit within our organization. The data is fine as long as it tells the story you want to hear. The second the data doesn't align with programmed reality, it's ignored or worse. Most organizations are like the Matrix (the first movie, not the other two:)) Agents of the organization don't appreciate stuff that doesn't belong.

      I think we'll see less of this (or this in a different flavor) for a short time as budget constraints change motivations and force behavior changes. Fun times.

    2. I should say that this resistance to data reality isn't universal in our org. Our vocational / operational personnel and management are well conditioned to respond to real data, mostly. But this conditioning takes a long time...

      Hard work to start, hard work to maintain. Data doesn't collect or analyze itself, nor does data form its own actionable recommendations. Lots of human work involved. This is where I hope better system integration will help. If we can at least avoid increasing the FTE requirement while significantly increasing the sight picture, I'd call it a win.

    3. One good thing about the xAPI is that it is data neutral.

      One bad thing about the xAPI is that it is data neutral.

      :)

      The truth is that most orgs don't know what data to collect, what the data they collect means, and even if they do...are they willing to act on it?

      I have heard this theme a lot in discussions around the xAPI. I agree, Ellen...we'd do well to spend our time there first. But the xAPI is forcing our hand, and that worries me. Shouldn't we get the data right first before we implement the technology? What are the risks of the tech coming first?

      How do we move our industry towards smart analytics so that the xAPI is really useful? Who is leading that conversation?

  11. Here are my two cents.

    Under the current methods, i.e. SCORM and LMS technology, there is no chance to link performance to learning in any automated sense.

    Tin Can provides the potential for that linkage to occur. That is to say, multiple systems can report data to a Learning Record Store. As a result, you could query the LRS for learning results from an LMS and performance results from a production system and determine correlation.
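
    A very rough sketch of that idea, with made-up endpoints, credentials, and activity IDs (the correlation step is ordinary statistics, nothing the spec itself provides):

    import statistics
    import requests

    LRS = "https://lrs.example.com/xapi"                 # hypothetical shared LRS
    AUTH = ("reporting_user", "reporting_password")      # hypothetical credentials
    HEADERS = {"X-Experience-API-Version": "1.0.0"}

    def scores_by_actor(activity_id):
        """Map each actor's mbox to a scaled score from that activity's statements."""
        resp = requests.get(LRS + "/statements", headers=HEADERS, auth=AUTH,
                            params={"activity": activity_id}).json()
        return {s["actor"]["mbox"]: s["result"]["score"]["scaled"]
                for s in resp.get("statements", [])
                if s.get("result", {}).get("score")}

    # Learning results reported by the LMS, performance results by a production system.
    learning = scores_by_actor("http://example.com/courses/call-handling")
    performance = scores_by_actor("http://example.com/metrics/call-quality")

    # Pair up the people who appear in both sets and look for a relationship.
    people = sorted(set(learning) & set(performance))
    if len(people) >= 2:
        r = statistics.correlation([learning[p] for p in people],
                                   [performance[p] for p in people])   # Python 3.10+
        print(f"n={len(people)}, Pearson r={r:.2f}")

    Correlation is not causation, of course, but this is the kind of query a single record store makes possible that SCORM-in-an-LMS never did.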

    This level of criticism of an emerging standard is curious to me.

    To me, this is not a case of the technology coming first. We have been measuring performance and evaluating training for a long time - Tin Can provides a needed tool.

    I think where we have to be VERY careful is in the realm of collecting too much data and defining privacy issues.
