Published On: July 10, 2018
PODCAST: How does collaborative learning drive neonatal health process improvement? Listen to The Talented Learning Show!

WELCOME TO EPISODE 12 OF THE TALENTED LEARNING SHOW!

To learn more about this podcast series or to see the full collection of episodes visit The Talented Learning Show main page.


 

EPISODE 12 – TOPIC SUMMARY AND GUESTS:

Vermont Oxford Network (VON) is a nonprofit collective of multidisciplinary medical professionals, working together to improve the landscape of neonatal healthcare. Founded in 1988, VON has evolved into a community of practice for clinicians from more than 1000 neonatal intensive care units and level I and II hospital nurseries around the world.

What can this organization teach the rest of us about effective social learning strategies? Join me as I explore this topic with two of VON’s program leaders:


 

KEY TAKEAWAYS:

The VON collaborative learning network elevates the practice of neonatal medicine through ongoing research, evidence-based knowledge sharing and quality improvement.

This kind of collaborative approach presents unique communication, procedural and technology challenges that require flexible learning strategies and continuous feedback loops.

Collaborative learning has tremendous potential to add value in other industries where professional expertise is limited and geographically dispersed.


 

Q&A HIGHLIGHTS:

JL: Welcome, Denise and John! How would you describe the mission of Vermont Oxford Network (VON)?

DZ: Our mission is to improve the quality and safety of medical care for newborn infants and their families through a coordinated program of research, education and quality improvement projects.

JL: Great.  Let’s focus on that education pillar.  What is your scope and who is your audience?

DZ: We facilitate education and quality improvement collaboratives for interdisciplinary teams of healthcare clinicians you would find in a neonatal environment.  These teams focus on infants who require very intensive care because they’ve been born prematurely or are facing serious challenges.

JL: Interesting.  How does that work?

DZ: We have two types of collaboratives.  From one, we collect and analyze data that helps us identify opportunities to improve newborn clinical care.  In addition, we have a strictly Internet-based collaborative where we define, deliver and evaluate quality improvement with a very intensive process.

For both teams, we use quality improvement science and methods to identify where they might be able to improve care.  In addition, we provide tools, resources and evidence-based practices to help them achieve that.

JL: How do these two audiences differ in their learning requirements?

JM: The collaboratives are actually quite different in how they learn and the types of learning we provide.

1)  One benefits from more traditional online learning.  We provide a targeted set of lessons they work through.  When they earn credit, they notify their supervisors, so their centers can achieve their goal of having the desired number of learners with that universal education.

2)  The other type of collaborative is much more communicative and interactive.  They work together to gain knowledge through a particular sequence of steps to develop an abstract or a learning improvement project, which they continually modify over time.

JL: So the second collaborative requires a more innovative approach?

JM: Yes. Traditional online learning doesn’t fully meet their needs. It fits when they have a particular goal and they’re trying to gain some background and understanding of prior knowledge. Online lessons can help with that. But viewing lessons for credit is only one aspect of what they ultimately need to accomplish.

So we integrated different types of scaffolded learning to segment the content, so teams can focus on specific topics as they work through a particular project and determine how best to teach that to their users. The learning experience has to be flexible, yet very targeted and scaffolded at the same time.

JL: What do you mean by “scaffolded”?

JM: Scaffolding is basically segmenting. It helps when people are learning together, and you want to be sure they focus on one task at a time in a progressive series of steps. We chose this method because it’s often easier to learn something complex when you break it up into smaller segments.
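As an illustration of what that might look like in practice, here is a minimal sketch of scaffolded content modeled as an ordered series of segments, where the learner focuses on one task at a time and moves forward only after completing the previous one. The class names and example segments are hypothetical, not VON’s actual system.

```python
# Hypothetical sketch of scaffolded learning: an ordered series of segments,
# where a learner focuses on one task at a time in a progressive sequence.
from dataclasses import dataclass, field

@dataclass
class Segment:
    title: str
    task: str
    completed: bool = False

@dataclass
class ScaffoldedModule:
    segments: list = field(default_factory=list)

    def current_segment(self):
        # The learner's focus is always the first incomplete segment.
        for segment in self.segments:
            if not segment.completed:
                return segment
        return None  # all segments finished

    def complete_current(self):
        segment = self.current_segment()
        if segment:
            segment.completed = True

module = ScaffoldedModule(segments=[
    Segment("Background", "Review prior evidence"),
    Segment("Aim statement", "Draft a specific, measurable aim"),
    Segment("Small test", "Plan a short test of one change"),
])

module.complete_current()
print(module.current_segment().title)  # -> "Aim statement"
```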

JL: How did that work?

JM: It definitely needs more planning and takes more time, but it gives learners space for self-reflection and helps them gain more confidence as they’re learning. That’s important in these environments, where teams are continuously testing and changing things.

JL: It sounds like you need serious executive-level buy-in from your members. How do you accomplish that?

DZ: We started with the data set.  We wanted to understand the practices teams engaged in and the critical outcomes they were seeing by the time a baby went home or was transferred to another center.  Centers agreed it would be helpful if they knew what was going on with care across their whole community. So we collected a small, critical set of anonymous data that compares apples with apples. We share this in an annual report that indicates where each center performs relative to the aggregate of all others doing the same work.  That established the goodwill.
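As a rough illustration of the apples-to-apples comparison Denise describes, here is a minimal sketch assuming each center submits a small set of anonymized outcome counts and the report shows each center’s rate next to the aggregate of all other centers. The center names, fields and numbers are invented for the example.

```python
# Hypothetical annual-report calculation: each center's outcome rate
# compared against the aggregate of all other centers in the network.
centers = {
    "center_a": {"cases": 120, "events": 6},
    "center_b": {"cases": 200, "events": 14},
    "center_c": {"cases": 90,  "events": 3},
}

for name, data in centers.items():
    others = [d for other, d in centers.items() if other != name]
    own_rate = data["events"] / data["cases"]
    agg_rate = sum(d["events"] for d in others) / sum(d["cases"] for d in others)
    print(f"{name}: {own_rate:.1%} vs. network aggregate {agg_rate:.1%}")
```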

JL: Brilliant…

DZ: And that’s how the network was built. From there, it was a natural step for organizations to say, “We’re not doing as well as our colleagues in this particular area.  We should try to see what we could do to improve. If we want to help our partners identify opportunities for improvement, we should invest in giving them some resources to do that.”  So that’s what led to us offering learning capabilities.

JL: Interesting. And how exactly does the quality improvement methodology work?

DZ: Healthcare organizations are designed to make changes in three ways:

1)  The research method – where you gather a lot of data ahead of time and administer a well-designed intervention to a specific group in a study.  Then you assess your findings and hopefully that will guide your decisions about how to improve your practices.

2)  The second approach is more top-down – where an interdisciplinary group of professionals gets together, reviews some evidence and reaches a consensus about a new standard of practice.  Then this is rolled out more universally in a large-scale implementation.

3)  Where it’s appropriate, quality improvement is a third option. In this scenario, the group may say, “We have an idea what might be helpful and we have some evidence for it.  So let’s just start making small changes.  Instead of investing in a gargantuan project, let’s do one very specific thing and see if that makes a difference.  Let’s study it, maybe just for a week or so or with a few infants or families.  Then let’s assess the outcomes and move on to additional changes.”


The third approach is the foundation of our network – creating the loop between submitting your data, seeing where you stand relative to colleagues, and then offering an opportunity to participate in a quality improvement collaborative with experts who can help you apply methods to make desired changes.
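The small-cycle approach Denise describes resembles a plan-do-study-act style of iteration. Here is a minimal, hypothetical sketch of that loop, just to show the shape of it; the change, measure and baseline are invented placeholders, not VON’s actual methodology or data.

```python
# Hypothetical sketch of the small-cycle loop described above: try one specific
# change, study it briefly, assess against a baseline, then decide what to do next.

def run_small_cycle(change, measure, baseline):
    """Try one small change, measure it for a short period, compare to baseline."""
    result = measure(change)           # e.g., a week of data, or a few infants/families
    return {"change": change, "result": result, "improved": result < baseline}

baseline = 4.0  # invented baseline for some outcome measure (lower is better)
cycle = run_small_cycle(
    change="one specific practice change",
    measure=lambda change: 3.2,        # stand-in for the short study period
    baseline=baseline,
)
print(cycle)  # then decide: adopt, adapt, or abandon, and run the next small cycle
```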

JL: I’m curious – what role does technology play in this collaborative learning process?

JM: Well, we leverage our learning management system, YourMembership, as a learning platform. About three years ago we followed more of a traditional format. But focus group feedback helped us realize that a traditional learning model just wasn’t aligned with the collaborative work that groups need to do.

So instead of trying to fix this as a single massive problem, we focused on small incremental improvements. For example, we deconstructed a particular project into key steps a team needed to complete.  Then we used our learning management system and other technology to make that content more interactive and accessible.  Also, we integrated various examples teams had previously worked on, so they would have ideas about how to keep the big picture in mind while moving through each step.  Plus, we gave them specific exercises, examples and lessons focused on doing each step well.

JL: So you’re applying continuous improvement to the learning program itself…

JM: Next, we kept the same model, but expanded the scope a bit.  Then we said, “Okay, now you’ll be going through step 1, 2 and 3…all the way to step 11.”  So each collaborative focused on the deliverable for each step, based on their own culture and the product they wanted to improve.  Ultimately, each of those steps helped them build the target deliverable, which was an abstract.
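As a rough sketch of that expanded model (hypothetical, not the actual YourMembership configuration), a collaborative’s project could be represented as 11 sequential steps, each with its own deliverable, building toward the final abstract.

```python
# Hypothetical representation of a collaborative project as 11 sequential steps,
# each producing a deliverable that feeds into the final abstract.
steps = [f"Step {i}" for i in range(1, 12)]
deliverables = {}  # step name -> submitted deliverable text

deliverables["Step 1"] = "Team charter and aim statement"
deliverables["Step 2"] = "Baseline measures and current-state review"

completed = [s for s in steps if s in deliverables]
print(f"{len(completed)}/{len(steps)} steps complete; "
      f"abstract ready: {len(completed) == len(steps)}")
```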

JL: Did you see improvement?

JM: We added value to the learning process by meeting with the teams, listening to their input, aligning with their goals and keeping it simple.  One thing we learned at the beginning was that we were overwhelming teams by trying to give them everything we thought could be beneficial, including the learning system.  But they need to focus on a particular goal, and if a lot of things are coming at them that don’t move them forward, it can be overwhelming.  People can tune out or shut down because they don’t see a clear path.

JL: Interesting. For others who would like to develop this kind of program, what advice would you share?

JM: The best thing learning professionals can do is to set a good example of what it means to be a strong learner by making sure we’re listening to learners and aligning with their needs. Here’s one small but important example from our world:  by listening to learners, we discovered the importance of making content printable.

From a technology standpoint, that might seem like a step back.  So we asked, “Why is making something printable good for you?”  Well, many of them said that, with everything else learners have to do at work, they have very limited time to plan for team meetings.  Sometimes they don’t have an internet connection, or a projector, or they don’t all have a laptop with them.  In other words, they need to be flexible, and printed content gives them another option.

The only way we discovered that printing matters was by listening to our learners. So my best advice is to be sure you understand the need and integrate tools that make sense, instead of trying to build something completely new and cool that may actually miss the mark.

 

FOR MORE QUESTIONS AND DETAILED ANSWERS, LISTEN TO THE FULL PODCAST NOW…


 

LISTEN TO MORE PODCASTS!

If you haven’t already subscribed to The Talented Learning Show, you can tune in now with whatever method you prefer:

Thanks for listening!


WANT TO LEARN MORE? REPLAY THIS WEBINAR:

How to Capture Lifelong Learners: A Holistic Approach to Continuing Education

WEBINAR REPLAY: How can you support lifelong learning as a continuing education provider? Replay this expert panel discussion led by independent learning tech analyst John Leh. Continuing education can be a lonely experience. Many of us must rely on ourselves to identify credible training sources, choose and consume content, earn certifications and demonstrate our value in the marketplace. But it doesn’t have to be that way.

How can continuing education providers make it easier for professionals to connect with the right resources and navigate through the lifelong learning process?

Find out from our panel of experts:

  • John Leh – CEO and Lead Analyst – Talented Learning
  • Tamer Ali – SVP Education – Community Brands
  • Jacob B. Gold, CAE – Director, Education Development – Community Associations Institute
  • Kevin Pierce, MAT – Manager, Digital Learning – American Academy of Dermatology

You’ll discover:

  • Why and how to create a lifelong competency model
  • How to support self-guided and directed content paths
  • How AI helps enhance content recommendations and analyze results
  • The value of digital badges and credentialing
  • Pricing methods that lock in long-term subscribers

REPLAY NOW!

 



About the Author: John Leh

John Leh is Founder, CEO and Lead Analyst at Talented Learning and the Talented Learning Center. John is a fiercely independent consultant, blogger, podcaster, speaker and educator who helps organizations select and implement learning technology strategies, primarily for extended enterprise applications. His advice is based upon more than 25 years of learning-tech industry experience, serving as a trusted LMS selection and sales adviser to hundreds of learning organizations with a total technology spend of more than $100 million and growing. John would love to connect with you on Twitter or on LinkedIn.