Shifting Camtasia into the “AI Video Editor” Category with a Trust-First Framework
Company: TechSmith (Camtasia)
Role: Senior Marketing Content Strategist
Timing: Oct 2025
In October 2025, I created the HUMAN Framework for AI in Training Videos: a memorable, audience-first framework that helped reposition TechSmith's Camtasia as credibly AI-powered (without the “AI jazz hands”). The work was supported by customer-informed research and a pillar post that became one of our strongest-performing newsletter features of the year.
The situation
At trade shows and industry events, we kept hearing the same theme: customers and prospects didn’t realize TechSmith had AI-powered tools, even though Camtasia already shipped them. At the same time, our core audience (corporate trainers and instructional content creators) was skeptical of “AI for the sake of AI.” Accuracy and trust matter in training content, and polished-but-wrong AI output is easy for experts to spot.
So we had two problems:
Messaging mismatch: Camtasia didn’t come to mind for people as an “AI video editor.”
Adoption friction: even when our audience was open to AI, many were still figuring out how to use it effectively.
I wanted to create something that felt practical, respectful of their craft, and repeatable enough to spread.
Constraints
Category noise: “AI video editor” was becoming a buzzword soup, and our audience could smell fluff from a mile away.
Trust-heavy audience: corporate trainers care less about novelty than about accuracy, clarity, and learner outcomes.
Credibility mattered: to land thought leadership in this space, the message needed to feel grounded in the world of training (not just marketing).
The insight
If we wanted to earn the “AI” association without triggering skepticism, we needed to lead with what our audience values: expertise, intent, authenticity, quality, and review.
So I built a framework to guide how trainers use AI in training videos.
What I did
1) Created the HUMAN Framework
HUMAN stands for:
Harness your expertise
Understand your audience
Make it authentic, not artificial
Aim for better, not just faster
Never skip reviews
The goal wasn’t to hype AI. It was to give trainers a foundation for using AI in a way that protects learner trust and content accuracy—while still getting real efficiency gains.
2) Built the framework on real audience signal
To make sure HUMAN reflected the reality of training work, I pulled insights from multiple sources:
Trade show retros: themes from conference debriefs (including a learning & development event) that sparked the initial idea
Training Advisory Board: real customers reviewed the framework and content throughout development; feedback shaped the final messaging
The Visual Lounge podcast archive: I mined relevant episode transcripts and pulled out recurring themes around AI in training/video creation
Dovetail user interviews: I reviewed interviews to understand how and why customers would use AI features, and where skepticism was showing up
3) Wrote the pillar post and built a distribution-ready asset set
I authored the pillar article: “How to Use AI in Training Videos the HUMAN Way.” It lives as a blog post on TechSmith’s site.
Important context: the published byline is a well-known training-space voice on my team (and host of our podcast). I ghostwrote the piece intentionally to increase credibility and reach with the intended audience.
I also partnered with a design vendor to create visual assets to support:
social promotion
email/newsletter inclusion
PR use
4) Turned supporting research into a story people could trust
In parallel, TechSmith ran two global viewer studies on instructional video formats. Participants watched short training clips that were identical except for narration voice or presenter format (including AI voices and AI avatars), then:
rated professionalism, confidence, and engagement
completed a short quiz to measure retention
I helped shape the research questions, build the narrative, and create supporting content assets and distribution around the findings, using the research to reinforce HUMAN’s practical stance on where AI helps, where it falls short, and why review/accuracy still matters.
5) Enabled internal teams to use the framework consistently
I created internal documentation so teams knew:
what HUMAN is (and isn’t)
when to use it
how to talk about it with customers and prospects
I also worked with PR to reference HUMAN naturally in relevant opportunities, and the framework was incorporated into our podcast and used in conference presentations by team members speaking on behalf of TechSmith.
Results
Newsletter performance: When featured in our external newsletter, the HUMAN pillar post drove 3x more clicks than any other blog post we included that year.
On-page engagement + surprise revenue: engagement on the post was strong, and it even contributed to unexpected sales (a bonus, since the piece was meant for upper-funnel brand association, not direct conversion).
Search visibility: In the months after publishing and promoting HUMAN, we nearly doubled site impressions for search terms including “AI,” and the post itself ranks for relevant queries like “create training videos with AI.”
Adoption: HUMAN became a shared language used beyond the blog, referenced in PR contexts, integrated into our podcast, and used in conference presentations.
What we learned
The clearest validation came from the people this was built for.
When our Training Advisory Board reviewed the first draft, they flagged sections with feedback like “HIGHLIGHT THIS” and “MAKE IT BOLD.” The reaction wasn’t polite approval; it was the kind of “hell yeah” response that signals you’re onto something: solving a real “okay, but how do I use AI?” problem without talking down to them.