(Note: Got into the room a couple minutes late, so there might be a bit of intro missed)
Intro - daVinci Xi
Meg B presents a case study, the background specifically, for an app that they've created (centered on surgeons). In essence, the providers wanted to be able to review patient information (among other things) while waiting in line at Starbucks.
The Hurdles
Complex topic, Multiple versions and translations, Reuse of Content, Limitations of Print, Headache to Manage, and Users Asking for Mobile - All issues we're faced with. All of these can stand in the way of designing almost anything.
Once the problems were defined, Meg's group needed to identify the benefits they were looking for and align them with their available options. Our respective lists will, naturally, differ from the one she shows, but there are certain similarities likely across the board. Ultimately, Meg was looking to get users the information they needed as succinctly and quickly as possible. Benefits included finding the right content, easy navigation, connection to processes, a media focus, and ease of updating.
They were mostly looking at tools that were 'container' apps - in other words, updates could be applied quickly. They also looked at some custom options, specifically ones they could develop without being coders.
With hurdles and benefits identified, cost was analyzed. Each vendor had a different pricing structure, so it made it even more difficult (no standard). Similar to LMS purchase decisions, they wanted to make sure everything fit their needs from a financial, audience, and maintenance perspective. Two comparisons, then, had to be made:
Feature Set vs Critical Requirements
Cost vs Budget
How'd They Pitch It?
Despite trying to get it approved via pitches and roadshows, they had to develop a pilot to get that final approval. The organization went with MagPlus. In doing the pilot, they had to create a measurable result, showing that it actually worked. Things considered included Methods of Data Collection, Pilot Participant Survey Questions, and Pilot Participant Interview Questions.
(So, get your budget together and develop your pilot - See Mobile App Pilot Proposal image)
Other participants in the room share their pilot experiences...from elearning to LMS selection to app development, too - Everyone's been there, everyone's got a story. It's refreshing to hear others' woes while reflecting on our own.
What's Next?
Once they had the content and the tool it was going to be developed in, how do we organize it? How do we chunk it better? Should we let learners jump around between topics? Decisions were made and Meg (see: Team of One) was able to develop it in a couple of months. All coding/dev was done in InDesign.
One other struggle in the pilot process proved to be getting people to participate in it. Meg created a spreadsheet and reviewed employee schedules so that she could identify candidates and elicit their participation. This included scheduling them, presenting the pilot to them, allowing them time to use it, scheduling their feedback, etc...there was a lot to it. Overall, this portion took 9-12 months.
(* Note - it was all available offline so it could be used in the surgical suite...talk about knowing your audience.)
Received some really good feedback from users - "Less paper to fumble through" came from one of their sales reps. They were able to show content to the customer without having to dig through documents. Met needs, Helpful images, Helpful video, Easy navigation, and a 100% rating that they would recommend to their colleagues = SUCCESS.
Folks around the room discuss feedback they've received from other pilots they've done. Mixed results, it seems - But at the core of asking for feedback is being willing, ready, and able to work on said feedback to further improve the content/product.
Yet Another Wrench...But Not Really a Wrench
Mid-pilot, there was a request to re-do the research on tools that could be used to develop the app in question. Based on that, Meg got approval for the budget that would support one of the originally planned options.
Another team came in and saw they were creating an app and wanted to add even more material (Content Marketing, etc.). Again, budget was provided to pay for a content developer. So, going back to the initial struggles, Meg and crew were able to develop what they were originally looking for.
Sharing the Content and Pulling It All Together
With the pilot going on, the tool needed to be decided on - There were two that did most of what they needed. Prioritization was key at this point as far as what needs should be addressed and addressed most urgently. While both options were significantly different, price points came in at about the same amount. One deciding factor was that the source files, assets, and articles would be reusable. Ultimately, they went with AEM (Adobe Experience Manager).
(Vendor issues, re: 'getting' daVinci persisted)
When Working With Vendors
1.) Get incredibly clear about:
- Exactly what will be done
- Who will do what
- What equipment/software will be needed
- Documentation of all agreements
2.) Request a proof-of-concept. Have them build you something (mock-up, etc.)
3.) Plan for analytics, data gathering
Organizing the Content
Next, they had to determine where the information was going to be held and how it was going to be sorted. By role? What's the smallest chunk to reuse? What naming convention? So many considerations to be had (Meg shows an org screen mapping what goes to whom, and it looks like it's literally a 6 point font). Meg mentions that, while this is the most painful, drawn-out portion of the process, it is the most significant...if you skip it and begin writing anyway, you'll have to go back more often than not.
Also, as far as organizing the development efforts, she marked off group by group (content, images, etc.) via a mega spreadsheet. While overwhelming to look at, it certainly made sure all the i's were dotted and t's crossed.
The Full Launch
When it was ready to go to the whole sales force, one platform was deployed to all. Later, they were able to add bookmarking (as more content was added). All in all, a phased approach to content was applied. Where it proved most useful was, say, a sales rep was 2 hours into an in-service and someone asked a question from the first 15 minutes. They were able to jump right back.
Solid anecdotal feedback was gathered from the sales reps, even some of the pilot champions. (Easily accessible reference tool, etc.)
Conclusion
Always great to hear someone walk through the twisted path they've gone through to make an amazing product. It's never as straight a line as people imagine, but Meg and her crew definitely made the best out of the entire process.