10-min read · May 1, 2022
Community

6 Experiments to Try in Your Community

Alex Angel

Lessons in experiments from community professionals who’ve tried them.

Every community has its own personality. It’s an inherently human thing, and the relationships people make, the environment that’s created, and the way people show up all contribute to a unique shared space.

It is precisely because of this amazing uniqueness that community professionals have to be creative and agile, anticipating and responding to whatever each day brings. The best-laid plans may be unceremoniously thrown out the window, and community folks have to be ready to move on and move forward with new ideas at the drop of a hat. Okay, maybe not that spontaneously… but quarterly, for sure!

Experimentation, whether or not you frame it that way in your own mind or within your organization, is a community pro’s bread and butter. It’s their best tool for success. Every experiment that’s run yields key information about what resonates with community members, and brings a community pro closer to delivering on core business objectives.

The beauty of an experiment is that it doesn’t have to succeed. If your experiment works, great. If it doesn’t, you still walk away with a critical lesson that was worth your time and energy.

We asked six community pros to share some of the experiments they’ve run, successful or not. You’ll learn why they thought it was the right move, what they did, and what they ultimately learned coming out the other side. Perhaps some of these experiments will be worth testing in your own community.

Polls

Michelle Fifis at Productboard

Experimentation is a key component of community building. Community programs can become stagnant and out of touch with members without experimentation. As community builders, it’s our job to bring these experiments to the community but not get too attached to our program ideas. If a program doesn’t resonate with the community, we can try to tweak the program, but we must remind ourselves that there may come a time when we have to close the program and move on.

The experiment

While planning our Product Makers Q1 content calendar, our goal was to offer a mixture of content types that follow a regular cadence. We were striving to balance live events, long- and short-form community discussions, videos, and polls. We thought that a weekly poll would be an excellent way for community members to easily voice their opinions and check in with their peers. Who doesn’t love a quick poll?

Well, apparently, our community.

Our participation rates were disappointing and dropped off significantly after several weeks. As a result, we made several minor tweaks to the program, including adding a discussion prompt to the poll, putting ‘poll’ in the headline, making the poll more thought-provoking, and making it more fun and ‘coffee talk’ style. Still, nothing seemed to resonate with the community. So, while we haven’t abandoned polls completely, it became apparent that polls do not provide the value our community members are looking for, and we now only use them occasionally.

The lesson

Our great poll experiment of 2022 was an important reminder that not all the engagement tactics we read about in top-10 lists will resonate with our community. But, in a way, this divergence from the norm is what makes our community so unique and enjoyable to build.

Gamification

Samantha Pennington, Head of Community at Streamloots

This year, we launched an incentivized pilot program called Streamloots Rewards, with the goal of activating, engaging, and rewarding our community. By using Discord to connect, share, and learn from each other, streamers would earn experience points, climb the ranks, and receive prizes.

The experiment

We leveraged the MEE6 bot to map out 10 ranks that our community could achieve, with XP generated for every minute of conversation and additional XP doled out manually by staff for completing certain high-value actions – like referring a friend, creating content about Streamloots, or hosting a related event.

In just over a week, two streamers had achieved our highest rank: Legend. We underestimated just how much our community loves to chat. Within the first day, it became apparent that leveling up might be a little too easy. While chatting is great, we also wanted to encourage meaningful involvement and direct actions that can help us reach new streamers and grow the community. We did a fast follow-up and adjusted the XP required for higher ranks, along with the rate at which streamers could generate XP, so Streamloots Rewards was achievable but still required some effort and intentionality.
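As a rough illustration of why the thresholds needed tuning, a quick back-of-the-envelope calculation shows how dramatically time-to-top-rank shifts when you adjust either the XP rate or the rank requirement. The numbers below are invented for the sake of the example, not Streamloots’ actual MEE6 configuration:

```python
# Hypothetical numbers only; not the real Streamloots / MEE6 settings.

def hours_of_chat_to_rank(threshold_xp: int, xp_per_minute: int) -> float:
    """Hours of active chatting needed to accumulate threshold_xp."""
    return threshold_xp / (xp_per_minute * 60)

legend_xp = 10_000  # hypothetical XP required for the top rank, "Legend"

# Before tuning: generous XP rate, low threshold -> reachable with roughly
# a week of casual chatting
print(hours_of_chat_to_rank(legend_xp, xp_per_minute=20))      # ~8.3 hours

# After tuning: slower XP rate, higher threshold -> requires sustained,
# intentional participation
print(hours_of_chat_to_rank(legend_xp * 5, xp_per_minute=10))  # ~83.3 hours
```

The same math works in reverse: decide how much effort the top rank should represent, then set the threshold and XP rate to match.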

The lesson

We saw unprecedented engagement and activity in our Discord – an almost 95% increase from the previous month. With so many great conversations all across the channels, Discord got a little chaotic. To minimize overwhelm, streamline where conversations happen, and help streamers find the ones they care about, we recommended and began to model Discord best practices like replies and threads.

Adding groups

Audrey Stevenson, Senior Specialist, Community Strategy and Integration at SAP

In a community of hundreds of thousands of daily users, experimenting is not something we take lightly. It’s a matter of scale: is the ‘experiment’ something we can try in a limited way, maybe a new standard text for our moderators to use to encourage appropriate behavior? Or is it something larger, like a new feature?

While smaller experiments are something we do frequently, larger experiments involve more planning. This is especially true for experiments that change an existing experience; those require extensive user testing. Change can be difficult, welcomed by some and loathed by others. So we are unlikely to experiment with changes to existing functionality; instead, we plan and test, test, test.

The experiment

There are, however, circumstances when we can experiment on a larger scale: adding something that isn’t actually new. If a feature has been part of a platform for a long time and other companies already use it, it’s easier for us to trust that it will work when we roll it out in our own community. In those cases, the uncertainty lies in our community’s reaction. Here, the trick is to always start small.

That’s what we did recently when we decided to experiment with adding groups as a functionality. We had received requests for groups over the years, but we weren’t sure how our members would react to the new content type. So we rolled out just two groups, in conjunction with our marquee event. In the end, community uptake was positive, but if that had not been the case, we could easily have closed the two groups.

The lesson

Bottom line in a community of scale: experiment cautiously, try to start small, and listen to feedback.

Creating a minimum viable community

Sara Saunders, Community Operations Consultant

As community founders, we rely heavily on experimenting within our community, but sometimes the experiment ends during research and discovery. That was the case for a Minimum Viable Community (MVC) for Helping Creatives in May 2021.

The experiment

Research, research, research! I studied other communities tackling different areas for the same ideal member, and looked at where potential members were already showing up online. While I was clear on the mission, values, and vision for the community, I wasn’t sure how I was going to bring to life an MVC that felt manageable, authentic, and vastly different from the communities already in the space. The more I dug into the existing communities serving the same potential members, the clearer it became that what I had planned for the MVC launch was already being met and exceeded elsewhere.

The lesson

Two things stopped my MVC launch:

  1. A lack of differentiation in offerings compared to competitors in the same community space; potential members were already well engaged in existing communities.
  2. As I planned the MVC, I quickly realized I didn’t have the time to give the community the commitment it needed.

With these two realizations, I knew it was time to pause the launch until I could resolve those barriers. After a year of groundwork and new availability to give the community the time commitment it needs, I look forward to launching Helping Creatives in 2022.

Internal community

Jillian Bejtlich, Director of Community, Documentation, and User Education at Zapier

If you ask anyone who has ever worked with me, they can attest that I often talk about experimenting in community as “throwing spaghetti at the wall and seeing what sticks.” It’s just what we do. Experiment, iterate, accept a healthy dose of success, and some epic fails along the way. After that? Just keep throwing spaghetti.

The experiment

One of my favorite (successful) spaghetti-throwing sprees I’ve ever run was for an internal community of knowledge. The organization was super acronym-centric, almost to the point of acronym usage being a badge of honor. It was, unfortunately, commonplace to sit in a fast-paced meeting not understanding every third word, with no opportunity to ask what those acronyms meant. It was not a great experience, especially for new employees.

The lesson

I had a theory that if employees had a place to share and search for acronyms and their meanings with no barrier to entry, they would. Searching, adding, and modifying had to be as easy as breathing, so I set up a part of our internal community as a totally open, highly searchable wiki called Acropedia. I added 37 acronyms I knew of.

Within a year, we had over 3,000 acronyms contributed by hundreds of employees across the organization. Last I heard, it’s doubled. The moment I knew the experiment was a success was when I found myself walking past conference rooms and spying Acropedia up on the screens of employees furiously searching as they kept up with the meeting. Win!

Mentorship program

Grace Cheung, Social Media and Community Manager at Lattice

Once I started managing Lattice’s Resources for Humans community, I noticed a common theme popping up amongst our community members: the desire to take their careers up a notch.

Aside from requests for recommendations on professional development courses, HR certifications, and the like, a growing number of people were asking to connect with someone at a more senior level or were seeking mentors outright. This need sparked our community mentorship program, which we simply called the Resources for Humans Mentorship Program.

The experiment

I’d never run a mentorship program before, but I couldn’t let that stop me, so I just made sure to do my research before doing anything. I chatted with other community managers who had developed mentorship programs previously, set up one-on-one calls with community members who I knew had been in mentorship programs themselves or who had run one for their companies, and simply Googled best practices for mentoring. After much research, I mapped out what our member journey looked like for this program, created an application form, wrote up a program handbook, and created surveys to gather feedback from program participants.

Like any community program, it wasn’t perfect from the start. From matching mentors with mentees to sending out a whole slew of emails over the course of four months, it was a very manual process that took up a lot of my time. And, of course, there were a lot of things out of my control. People not responding to Slack DMs (our community is on Slack) or emails, a low completion rate on our feedback surveys, time zones not aligning, and mentees and mentors ghosting their assigned person were just a few of the problems that cropped up.

Despite that, the first iteration of the program was successful overall, with a 78% participation rate across the pairings I had set up. Members who did fill out our surveys were overwhelmingly happy that we had created a mentorship program, and most mentees were happy with who they had been paired with. The constructive feedback I received gave me a lot of learnings to apply to future mentorship semesters (I now run a Spring and a Fall semester).

The lesson

With the lessons learned and help from a small brain trust of community members who participated in that first cohort, I launched a few new things this year to improve the program, such as a virtual orientation call, a revamped handbook for mentors and mentees to use, an improved application to increase the success rate of matches, and professional development and networking events for our members throughout the year to complement the program.

What worked: Listening to community members throughout the process, giving program participants a handbook to guide them through the process, and doing constant check-ins throughout the semester to make sure things were going well.

What didn’t work: A shorter application meant less-than-perfect matches, managing everything in Google Sheets was too time-consuming, and we were missing a way to measure success more easily (and quantitatively).
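For what it’s worth, even a small script can take over the most repetitive part of that spreadsheet work. The sketch below is purely illustrative, with made-up fields (a timezone offset and a set of interests from a hypothetical application form) rather than Lattice’s actual process; the idea is simply to score candidate mentor/mentee pairs and assign them greedily:

```python
# Hypothetical sketch: score mentor/mentee pairs by shared interests and
# timezone proximity, then greedily assign the best available mentor.
# Field names are invented; adapt them to your own application form export.

from dataclasses import dataclass

@dataclass
class Member:
    name: str
    timezone_offset: int   # hours from UTC
    interests: set[str]

def match_score(mentor: Member, mentee: Member) -> float:
    # Reward overlapping interests, penalize large timezone gaps.
    shared = len(mentor.interests & mentee.interests)
    tz_gap = abs(mentor.timezone_offset - mentee.timezone_offset)
    return shared - 0.5 * tz_gap

def greedy_match(mentors: list[Member], mentees: list[Member]) -> list[tuple[str, str]]:
    pairs, available = [], list(mentors)
    for mentee in mentees:
        if not available:
            break
        best = max(available, key=lambda m: match_score(m, mentee))
        available.remove(best)
        pairs.append((best.name, mentee.name))
    return pairs

# Tiny demo with invented members:
mentors = [Member("Ana", -5, {"people ops", "comp"}), Member("Raj", 1, {"L&D", "comp"})]
mentees = [Member("Kim", -4, {"comp"}), Member("Lee", 0, {"L&D"})]
print(greedy_match(mentors, mentees))   # [('Ana', 'Kim'), ('Raj', 'Lee')]
```

A greedy pass like this won’t produce perfect matches either, but it turns hours of manual cross-referencing into a starting point a human can review.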
