Content Strategy Applied – San Jose, CA – Oct 30, 2019 Consumers don’t search for information; they look for answers. Emerging technologies around conversational interfaces are driving changes to how we create, manage, and deliver the answers they seek. In this presentation, Rob Hanna explores why we must stop attempting to strong-arm a solution by piling new content on top of old content, creating new silos of information that we must wade through and struggle to keep current. Hanna discusses why we need to distill our content into appropriately sized chunks of information in the form of intelligent microcontent, so that it can be efficiently consumed and reintegrated across different business functions in the enterprise.
LavaCon 2019, Portland OR – October 29, 2019 Content distilled into appropriately sized chunks of information in the form of intelligent microcontent takes on a new life when it is made available to the enterprise for consumption and reintegration across different business functions. The silos all but disappear when content is rendered as microcontent that is classed, focused, structured, and contextualized. This microcontent can then flow freely across product, marketing, training, and support documentation. Download the presentation to learn more about this transformational opportunity.
STC NY – June 13, 2019 – Rob Hanna travelled to New York City to share his presentation on a new way to approach instructional design and develop learning materials that scale down from classroom instruction to just-in-time microlearning covering the spectrum of the five moments of need:
1. New—When learning to do something for the first time.
2. More—When expanding the breadth and depth of what has been learned.
3. Apply—When people need to act upon what they have learned and adapt to new challenges.
4. Solve—When dealing with problems as they arise.
5. Change—When people need to learn a new way of doing something, which requires them to change skills that are deeply ingrained in their performance practices.
DITA North America Conference – April 16, 2019
Intelligent agents and AI-powered cognitive content solutions perform best with machine-ready content—intelligent content designed to be read by humans and processed by computers. To deliver the right answer to prospects and customers who have questions, you’ll need to optimize your content production approaches and begin crafting content with the precision humans appreciate and machines require. Welcome to Intelligent Microcontent.
OmnichannelX Conference, Amsterdam – January 31, 2019
It isn’t enough to deliver the right content to the right people at the right time. Your customers don’t just need pages anymore; they need answers. The move to chatbots and voice user interfaces will change how we deliver content, allowing users to seamlessly shift modalities between seeking answers and exploring content. You still have to maintain traditional publishing channels while serving the needs of emerging ones, so it only makes sense to do both from the same source of content.
For new voice-based channels, your content needs to be concise and aligned with specific user questions, while still feeling natural whether it is read on a screen or read aloud by text-to-speech. This means each piece of content must be carefully written with a specific intended user response in mind.
Learn more about how you can transform your content using microcontent to not only enable emerging technologies but also make your traditional channels richer and far easier to use. We cannot wait for disruptive new technologies to land on our desktops before rethinking our content. Making content easier for machines to use will vastly improve its accessibility for human consumers in every way.
STC 2018, Orlando, FL. – May 21, 2018 – Intelligent agents and AI-powered cognitive content solutions perform best with machine-ready content—intelligent content designed to be read by humans and processed by computers. To deliver the right answer to prospects and customers who have questions, you’ll need to optimize your content production approaches and begin crafting content with the precision humans appreciate and machines require. There’s no reason to create new departments full of writers dedicated to creating content specifically for chatbots and voice interfaces. A better solution is to leverage existing marketing and product content for these new delivery channels. To do so successfully, and at scale, you’ll need to bridge the gap between technical communication and marketing by developing a coordinated effort designed to produce high-quality content at scale: content that is accurate, timely, and contextually relevant. Attend this session to better understand the need for intelligent content and its applicability to chatbots, voice interfaces, and intelligent agents. You’ll discover the world of microcontent and the importance of framing information for intended user responses. And you’ll find out how aligning information types to memory encoding principles can help you dramatically improve content performance. > Download presentation
[Webinar] Dec. 12, 2017 — We’ve all heard about the benefits of content modeling and structured content for technical information. We’ve implemented DITA or other topic-based strategies to break content into smaller blocks to manage and publish, but topics are not small enough. The arrival of bots, voice-enabled interfaces, and AI means we must change the way we structure content. We are moving from a broadcast style of communication – publish and hope for the best – to a more conversational, question-and-answer style. This imposes new requirements on the content models you need to create if you want to talk to the bots. We must be more granular in our models. We need to implement microcontent. > View the recording
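The shift described above, from topic-level publishing to granular question-and-answer delivery, can be illustrated with a minimal sketch. This is a hypothetical model for illustration only, not Precision Content’s implementation: each microcontent chunk answers exactly one user question, and a bot retrieves a single chunk by intent rather than returning a whole topic.

```python
# Minimal sketch of a microcontent store keyed by user intent.
# All names and structures here are hypothetical, for illustration only.
from dataclasses import dataclass

@dataclass
class Microcontent:
    intent: str        # the user question this chunk answers
    answer: str        # one concise, voice-ready response
    source_topic: str  # the larger topic the chunk was distilled from

# A store mapping normalized intents to single-answer chunks.
store = {
    "reset password": Microcontent(
        intent="reset password",
        answer="Open Settings, choose Security, then select Reset Password.",
        source_topic="managing_accounts.dita",
    ),
}

def answer_question(question: str) -> str:
    """Return one granular answer for a chatbot or voice interface."""
    chunk = store.get(question.lower().strip())
    if chunk is None:
        return "Sorry, I don't have an answer for that yet."
    return chunk.answer
```

Because each chunk carries its own intent and source, the same granular answer can serve a chatbot, a voice interface, and a traditional help page from one source of content.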
[Conference] Nov. 7, 2017, LavaCon 2017, Portland, OR — What if we could transform structured content into searchable, reusable chunks of content that other groups could easily find from a highly trusted source and reuse in their slides, support sites, proposals, and emails? The improved rigour and precision of Content 4.0 offer greater utility and effectiveness for content delivery. In this session, Rob Hanna explores the methods, technology, and use cases needed to support delivery of DITA/XML as microcontent across the enterprise.
> Download the presentation
[Webinar] Sept. 13, 2017 — You may already know the value of content that is constructed based on its purpose rather than just its presentation. But many solutions today lock that value into one product or silo, inhibiting collaboration and reuse across the enterprise. The Darwin Information Typing Architecture (DITA) is a widely adopted XML standard that delivers on this promise of intelligent content, but there are some inhibitors to adoption. Some see it as too complex. Others prefer to keep creating content in existing formats such as HTML or Markdown. With Lightweight DITA in the works, we can finally bridge the gap between the promise of intelligent content and the challenge of diverse authoring platforms and communities. Featured speakers are Michael Priestley, Rob Hanna, and Steve Manning. > View the recording
[Webinar] May 31, 2017 — Over the past decade, thousands of organizations around the globe have adopted the Darwin Information Typing Architecture (DITA) to help them improve the way they create, manage, translate, and deliver content to prospects and customers. And yet, despite the many benefits DITA can provide, not everyone who has made the move is satisfied with the experience. Join Scott Abel, The Content Wrangler, and special guests Rob Hanna, Mark Lewis, and Keith Schengili-Roberts, three content strategy experts with deep experience solving content conundrums with DITA, as they discuss the results of the 2017 DITA Satisfaction Survey. The panel will review the top reasons survey respondents are dissatisfied with their DITA implementations, discuss why some DITA projects fail, and provide advice on how to best overcome these challenges (or avoid them altogether). > View the recording