InstructureCon 23 Conference Notes

Good conference and message if you are willing to overlook pseudo-demos


July is typically the busiest month for LMS user conferences, and last month Glenda Morgan, Jeanette Wiseman, and I tag-teamed to attend three of them: D2L Fusion in Anaheim, Anthology (Blackboard) Together in Nashville, and InstructureCon in Denver. With this being the first year of our premium newsletter On EdTech+, we would like to avoid duplicate content, so we are publishing each post with free content above a paywall break, followed by content for premium subscribers. This post covers InstructureCon.


In the 2010s, Instructure was the master of customer marketing in EdTech. Customers of the company’s Canvas LMS were the ones who really helped sell the system to others by word of mouth, and InstructureCon played a key role in enabling this strategy. Get people together in a cloistered environment (at a resort away from distractions), make it fun, encourage everyone to interact, and show your company culture by how the event was run. Michael Feldstein noted this as part of Instructure’s different style of openness in his coverage of the 2016 users conference.

Instructure’s particular take on openness shows up in other ways too. For example, by lowering customer anxiety regarding future uncertainty, the company is able to create a lot of space at the conference for networking. The primary activity at Instructurecon seems to be sitting down over meals, at parties, or in lounge chairs and chatting with folks. This is true of most conferences in some sense. The difference is that Instructure creates an environment in which customers don’t feel torn between attending sessions in hopes of gleaning scraps of information about product development and having high-value informal and opportunistic conversations. Instructurecon is not really a conference so much as it is an extremely well executed social event.

Since that time, Instructure has seen CEO Josh Coates leave the company in 2018, with Dan Goldsmith (blog commenter extraordinaire) taking over; gone private in early 2020 through an acquisition by private equity firm Thoma Bravo (a true soap opera of a process); changed CEOs again, with Steve Daly replacing Goldsmith; and then gone public again in 2021 (although Thoma Bravo still owns 88% of the company).

It would be unfair to expect Instructure of today to be the same as Instructure of the mid-2010s, and in particular we should not expect post-pandemic InstructureCon to be the same event.

But there are interesting comparisons to be made, particularly around openness. Today Instructure is one of the best-run EdTech companies around, with SUNY Geneseo being the only significant full-campus defection in higher education. Unlike the 2010s version, the company is (arguably) profitable and achieving financial results in stock price and revenue / earnings improvements while still leading the market. Yes, D2L has won some major head-to-head competitions, making most LMS evaluations a two-horse race, but that is more from D2L improvements than from Canvas failures.

This leads to my two main observations about this year’s InstructureCon.

  • Instructure is still a solid company whose customers by and large trust it and are benefitting from the partnership (i.e., don’t expect any big exodus); but

  • The biggest risk Instructure faces is its loss of product openness, as unlike D2L and Anthology, Instructure is now resorting to pseudo-demos that show a lack of trust in either the product or customer reactions.

Keep On Keeping On

There was no real change in strategy seen at InstructureCon. The company is increasingly emphasizing its portfolio of products built around the Canvas LMS, what it calls the Instructure Unified Learning Platform. Perhaps the strongest change in message is the increased emphasis on the EdTech Collective, Instructure’s partner ecosystem. In fact, two of the three conference press releases were on the ecosystem - describing the 850 partners as “a larger partner community than any other LMS provider” and announcing a partnership with Khan Academy and its Khanmigo AI-based tutoring and teaching assistant tool (more on the generative AI approach below).

In terms of Instructure’s own tools, the general theme was incremental changes “designed to save educators time, personalize learning experiences for students and simplify complex tasks for administrators.” There was nothing too exciting here, but general product usability improvements do not need to be exciting as long as they hit the market need and deliver real value.

Instructure also announced big wins at Duke University (the last major US Sakai campus) and Ohio University (from Anthology / Blackboard). OK, these wins weren’t really announced at InstructureCon, but they were highlighted there.

The most problematic product development issue in Instructure’s history (if you ignore corporate learning and Bridge) has been the long-running rollout of New Quizzes. It was prematurely announced in 2016 and has struggled ever since (in Learn Ultra fashion) to reach enough feature parity to convince current customers to migrate. To Instructure’s credit, the first part of their product keynote directly addressed New Quizzes, although without real metrics on adoption. Yes, customers are much happier with recent product development and are migrating, but there is no stampede. Much like Learn Ultra.

Overall, the message is that Instructure is continuing its strategy and doing a good job with the new InstructureCon in encouraging customer interactions.

Product Openness

Part of the reason I brought up the 2016 InstructureCon coverage is the first part of Michael’s quote.

Instructure’s particular take on openness shows up in other ways too. For example, by lowering customer anxiety regarding future uncertainty, the company is able to create a lot of space at the conference for networking.

This general approach works if customers trust Instructure with its product development, and I believe they did - customers really responded to the type of openness that Instructure had. I give the current company high marks for transparency in sharing a product roadmap, but there was no real depth or trust at the conference in sharing what these product enhancements mean.

For the top right roadmap item, how will the recurring events be created, and how will this save time in reality? Is this a design that works for most educators or only in a specific use case? It is fine to avoid these details on a high-level roadmap page, but there should be some attempt to show new features at a users conference. In EdTech, the context and nuance of product design, and whether educators in particular will truly benefit from time-saving or learning experience features, all matter quite a bit. And culturally, it is important to remember that Canvas became what it is largely because of its design choices, moving beyond a checklist mindset (the LMS has this and that) to a true usage mindset (that design really works for me in my courses).

This year, Instructure chose a main-stage style of drive-by feature references with the same level of detail as the roadmap description. If you went to the “deep dive” breakouts, there was not much more, if anything. Rather than openness, Instructure opted for control.

Almost all product demos were not live demos but rather pre-recorded video snippets. At one point this approach led to a product executive bringing out his laptop to “show a demo” of a feature set, making a light attempt to move his hand over the trackpad while the screen was clearly showing a canned video. There was no demo, and the attempt to fool the crowd was absurd. This is not just a matter of theatrics - it is a matter of trust.

Instructure either lacks trust that a live demo would work flawlessly, or that customers would understand any slip-ups, or both. D2L and Anthology (Blackboard) did not do this - they shared real demos at their conferences and trusted their customers. Why is Instructure taking this approach? The product does work, and customers have a lot of built-up trust in the company. There is no reason to squander this trust by trying to tightly control the narrative.

Generative AI

Instructure has developed a set of generative AI guiding principles that address responsible use, transparency, privacy, bias, and human-AI interaction. The actual AI developments could be broken down into three areas:

  • Partnerships such as with Khan Academy (primarily Khanmigo at this point);

  • Broader-based point solutions embedded into Canvas; and

  • A research lab looking at trickier potential AI usage not planned for the product yet.

Khanmigo is an interesting tool that can give additional prompts and guidance while a student is writing an essay - not writing the essay, but nudging the student in the right direction. Students can write an outline of an assigned essay, get feedback, and chat with Khanmigo. This tool will be integrated so that it is accessible within the LMS. Khan Academy in general is more popular in K-12 than higher ed, although there is crossover usage. I suspect this Khanmigo integration will have its primary impact in K-12.

One of the two primary areas of integrating AI into Canvas is a course templating tool, somewhat similar to what Anthology (Blackboard) is adding; note that Morgan had some commentary on the likely usefulness of this approach (hint - she has doubts and questions). The more useful area for Canvas is a planned natural language query to access system analytics. In the main stage demo on this topic, the example was asking “what percentage of our courses are processed across the account”, with a resultant graphical answer along with a generated descriptive text answer. This is interesting in potentially making it much easier for educators to get actual access to system data, effectively democratizing the usage of analytics.

The third area was an “exploration on learning-relevant use of emerging technology in experimental ways, but not yet available to customers.” I was somewhat surprised that Instructure would choose to get into these thornier areas, such as correct and wrong answer rationales “with inline chat clarifications on existing teacher quizzes, and if desired, generating new formative multiple-choice questions,” summarizing course notes, and auto-alignment of content to learning standards. These are all interesting areas, but they raise tricky questions. For example, a good friend pointed out that if the answer rationale works, how would this feedback count (or not) toward the regulatory guidance requiring regular and substantive interaction? And how would faculty be able to see what feedback was given to students and whether it is accurate?
