Have you ever forgotten to take your medicine? Did you notice the effects immediately, or was it after some time that you realized the cumulative effect? We’ve all been in that situation at one time or another: there was something you should have done (but didn’t) which would have made a big difference if you had.
I don’t spend a lot of time training large groups directly, but every once in a while I find myself at the front of a lecture room providing a formal session on some technology. I had this opportunity recently, and the best part of the whole deal came afterward: one individual told me the presentation helped “make it all make sense.” She wanted to see the whole thing again. So I made sure she was able to attend the repeat session I did some time later.
So what type of sessions were these? Were they on some emerging technology — something so cutting edge that very few had heard of it? Perhaps something proprietary I had worked on, being let out of the bag for the first time? Nope. It was a brief presentation on core concepts (ones that haven’t changed in a decade) — things each of the individuals used on their computers every day but had never understood. Coming out of one of these sessions, it’s not unusual for me to hear, “I wish I would have known this years ago.”
Now here’s the bummer out of the whole deal: these presentations weren’t well attended. Yet, less than 24 hours later I was in a consulting session with several individuals who had declined the opportunity to attend the previous day’s training and who displayed, by their repeated questions, a total lack of understanding of these core concepts. From where I sat, all I could think was, “You’re making this so hard on yourself. You’re struggling to get work done because you don’t understand what you’re working with.”
So therein lies a challenge. The lightbulb turning on for technology users/consumers is valuable. Core concepts are worth explaining. They provide the foundation for informed, empowered, confident users who go on to get things done. It’s challenging to get users to realize that they’re missing something that could make their lives a lot easier, provided they’re willing to invest a little learning time.
Further, when users are managed by others (as in a corporate setting) and money or time is on the line, the “watts” spent to turn on that lightbulb for users may not be perceived as money well spent by their management. I firmly believe this is shortsighted thinking. There can be a cost/benefit balance between “we’re too rushed to understand anything” and “we should have days or weeks of training on this.” So these days, it appears the easy thing to do is get a quick spoon-feeding — perhaps a procedure list or job aid — and rush on with business… the band-aid approach.
I’m curious. Do you think we’re such a shortcut-based culture that foundational learning (for adults) is deemed irrelevant? Is this amplified in technology spaces by the common use of abbreviated quick-start guides, books promising you can “Teach Yourself X in 21 Days,” and a market that always embraces faster and newer as good?
Lastly, is there a doctor in the house who can shed light on how physicians get patients to take what’s best for them? I know the core-concept medicine works. I’ve seen it happen again and again, and it doesn’t necessarily cost that much (though it must be administered well). How do you get the patients to take the medicine?