Creating a Safe Space for Your Members

This is a podcast episode titled, Creating a Safe Space for Your Members. The summary for this episode is: This episode is a recording of a session from our virtual user conference, Super Forum. To assist with monitoring and addressing inappropriate behavior such as trolling, overly promotional or harassing posts, the American Association for the Advancement of Science developed robust protocols, canned responses and enhanced T&Cs to help make our community a safer space for members, working with our in-house human rights group. In our session, we will share the ins and outs of our protocols, our best practices for creating a more harmonized online space, and what we found our biggest challenges and opportunities were.
Stronger Terms and Conditions
00:31 MIN
Nurture Vulnerable Groups
00:29 MIN
Building Protocol
00:39 MIN
Define Behaviors
00:36 MIN
Discuss More Functionality
00:34 MIN

Alex Mastrianni: Welcome to The Member Engagement Show with Higher Logic, the podcast for association professionals looking to boost retention, gain new members, and deepen member involvement.

Heather McNair: Throughout our show, we'll bring on some experts, talk shop about engagement, and you'll walk away with strategies proven to transform your organization. I'm Heather McNair.

Alex Mastrianni: I'm Alex Mastrianni and we're happy you're here. Hey everybody, welcome back to another episode of The Member Engagement Show. Today, we are taking it all the way back to November of 2020, which seems like a really long time ago but was really only four or five months ago, when we hosted our annual customer conference, Super Forum. And while it was a little bit different than normal, because it was virtual, we had a ton of great content that we covered.

Heather McNair: Yeah, it is very funny how time is just a little warped these days, isn't it?

Alex Mastrianni: Yes.

Heather McNair: And we know everyone couldn't attend, but we had some really, really fantastic content that we don't want people to miss out on. And so we thought we'd bring a little bit of super forum to you through the podcast today. So today's episode is coming to you from Emileigh Clare and Candace Freeman from the American Association for the Advancement of Science.

Alex Mastrianni: Yes. This session was all about creating a safe space for your members, which is so important today. You might not think that you'd have to worry about something like a professional, private community space being a place where things like harassment or personal attacks could come up. But it can happen from time to time.

Heather McNair: Yeah, fortunately, it doesn't happen very often. I know that's one of the... sometimes it can be a big fear for people as they jump into the online community space. There tends to be a much higher level of accountability in private communities, which is good, and that is one of the things that Emileigh and Candace talk about today. Creating that sense of safety, which is one of their big themes. It's always been a critical piece in getting community members to participate. And that goes across any online community. And our guests from AAAS talk about that. They talk about how it's paramount, also, for women and people of color, which obviously is a very hot topic these days. And one of the things that... It was very eye-opening for me. A few years ago, I was actually talking to one of our clients and she's a young woman in the technology sector. And she said that she actually won't post in most technology communities because she has been attacked by men in these forums. And they challenge her knowledge because she's a woman, and that was very startling to me and it was rather disturbing. And our speakers from AAAS talk about that and creating that safe space and how important that is. And I loved the things that they've done to ensure that these groups do have a safe space and feel very comfortable participating in their community.

Alex Mastrianni: Yeah. When that happens to someone, especially if it's a reaction or a response from someone that they're not anticipating, it really can make it hard for them to engage, feel comfortable, feel trust with the organization. So it's really important that an organization has steps in place to respond to this or remove the comments or do something to show what's acceptable for the community, for the space. And prevent things like that from happening in the future.

Heather McNair: Yeah, absolutely. Because if it happens to you once, you won't come back. And if you see it happen to someone who's like you, you're far less apt to participate as well. So it is one of those things, as someone who's running a community, hosting a community, you do want to ensure that those things don't happen. And that's just one of the things. We've seen a lot of things, several big things happen, over the last couple of years that have really brought this idea of creating a safe space in your online community to the forefront. Like I said, it's always been there. It's always been important, but it is right there, front and center, these days. Then obviously one of the first things that springs to mind, the pandemic, we can't ignore that. And I am so tired of hearing the word uncertainty. I hope after this year we never hear it again.

Alex Mastrianni: How about unprecedented, though?

Heather McNair: Unprecedented is another one which kills me, because it is precedented, but that's a whole other argument. No, but so, uncertainty... But the reality is, there has been a lot of uncertainty, and so people were looking for a place, have been looking for a place, to still connect with friends, to still connect with colleagues. Where they could get information they could trust. And they wanted a place that felt welcoming, that felt familiar, and they turned to their online professional communities to fulfill that need. I was actually on a call with several of our MVPs from the Higher Logic user group, HUG, last week, and a couple of them referred to it as their professional home. And I love that term and I love that they feel that way. That's how we want people to feel about their online community, about their professional community.

Alex Mastrianni: That's how I feel about the OC, which is the internal community that we have at Higher Logic with our fellow employees, and definitely the community that I participate in and go to the most. I don't know what it would be like over the past year if we didn't have it. We're all working from home, we're disconnected, and it's become a place that we go to for, not just company updates, but to hear from people who maybe we would typically see in the kitchen or passing by their desk. About what they did that weekend or what their kids are up to for spring break. Or fun posts about everyone's favorite furry co-workers and all the pandemic pups that have come into everyone's lives recently. So yeah, it's a great place to have those connections, and if something were to go wrong, it would be really uncomfortable or hard to stay connected and stay engaged, and to feel like you can go there and trust what's happening with other people that are in the community.

Heather McNair: Yeah, absolutely. And I've heard from several of our coworkers that they actually feel more informed about what's going on in the company, since we've been working remotely, than they did when we were in the office. And I think that's a huge credit to how our leadership has been using the online community to push out information and to keep everyone connected. And, yeah, we have to have some substitute for connecting in the kitchen over beef jerky and Cheetos. Another thing, and I'm going to keep politics out of this, but we've run into issues on a lot of the major websites with data breaches over the past couple of years. And also the proliferation, wow, that's a tough word to say, of misinformation. You don't know what you can trust, where that information is coming from. And so people really want to know that they're sharing information in a place where they know who the other people are. Is Alex Mastrianni really Alex Mastrianni? And are these people credible? When we're talking with AAAS, we're talking about scientists, and so they're sharing information and the ladies will talk about this today. Is the information they're sharing legitimate and trustworthy? And there's a lot to be said for that with a professional online community. And this is connected to your database, these people really are who they say they are. And they will also talk about some ideas around being able to verify that data and have people, other people in the community, verify that. Vote on that.

Alex Mastrianni: Yes. So without further ado, let's turn it over to Emileigh and Candace about some of the challenges that AAAS overcame and how they have made their community safer for members.

Emileigh Clare: I'm Emileigh Clare, the Assistant Director of Member Engagement at AAAS. And I've been working in the fields of mass media, communication, and trade and member associations for more than 10 years.

Candace Freeman: And I'm Candace, Senior Manager of Membership Training with AAAS. And I have 12 years of experience in nonprofit contact center training, quality assurance, and operations.

Emileigh Clare: So before we get started, we're going to give you a little background on our organization, AAAS. Not AS, not AAA, but AAAS. We are the world's largest multidisciplinary scientific society and a leading publisher of cutting-edge research through our Science Family of Journals, which is how most people know us. But it's important to know that we are not just a publishing organization; we have individual members in more than 91 countries around the globe. Our membership is open to anyone who shares our goals and our belief that science, technology, engineering and mathematics can help solve many of the challenges our world is facing today. And it's also important to note, too, that many of our members are not scientists; roughly 11% or so are not scientists and join AAAS mainly because they feel it's important to support our organization in advancing science and serving society. So that's really great. About a year and a half ago, we launched our community. At that point, our primary goals were to build member engagement with staff in our organization and let them know what we're doing, provide a space for networking, which was a primary reason for a lot of our members to join our organization, and also increase retention, make sure that they want to continue renewing their membership each year. So when we launched our community back in April 2019, I would say that we were almost a little Pollyanna, a little optimistic. Our terms and conditions were pretty open-ended, pretty loose, and I think we maybe had a little bit of rose-tinted glasses on at the time. Because we had some assumptions that our members would automatically be collegial with one another, and we were soon to discover that wasn't always going to be the case.
We also started to see our members use community for unintended things, like sharing that they had updated their address, and everyone then could see their new address, or asking where their t-shirt was. And I mean, this is in a community of over 100,000 people. And these types of things were not against our terms and conditions per se, but that was still not the engaging behavior we wanted to create. And then more importantly, I think for us, was when it came to topics like climate change, racism in STEM, whether vaccines are good or bad, for example. We saw a variety of perspectives and opinions. We saw even trolling from people, and it blows the mind, but some people will have their full PhD signatures, their credentials and their academic backgrounds as they continue to troll one another. So we were like, "Wow, this is new." My favorite example I like to talk about, just to share some of the stories that we were seeing, is we saw one member and another member having a back and forth, a hot debate, and one of them said, "Stop the micturition contest." And I had to look up micturition, I did not know what that meant, and yes, that is the medical word for peeing. So as we continued to observe these heated debates and essays and back and forth between a few members, we saw threads just going in directions that could honestly reflect poorly on our organization and really didn't fit the purpose of our community site and what we were looking for. We also had a theory that this was going to lead to fewer people wanting to participate, especially when we tended to see a trend of older white males arguing and posting to the threads and using it for that. And then one other observation that we noticed post-launch: we had a gap in resources and in how we could handle these problematic posts and the arguing and the back and forth. We hadn't initially anticipated needing to read through and monitor so many things, and how much time that could take up.
And then we also found that, not only do we need to read through these things, but we also had to have a plan in place to respond to those who would complain about somebody else. And also a plan to deal with the offender. So at the end of 2019, we organized staff together with diverse backgrounds and skills, to discuss how we could do better in this. We came up with a monitoring plan to handle things while we assessed updating our terms and conditions and started that assessment process. And I'll let Candace tell you about that process.

Candace Freeman: So initially, in response to the unexpected behaviors on community, like people posting service requests or their address update, we wrote protocols for how to move these requests to our case system and out of the discussion thread. And we created canned responses to communicate the actual purpose of the discussion threads. And we assigned community monitor shifts to our support staff. They did a great job of following those initial protocols and making it clear what the purpose of the community is. And once members got the hang of how they're supposed to use community, it became a lot more like the social media or networking tool that we had intended for it to be. However, the threads did start to look a lot like the problematic threads that we've all seen on Facebook or other social media. We quickly learned that while scientists may be knowledgeable experts in their field, they are prone to the same conversational pitfalls as everyone else. So the community monitor role actually needed to expand beyond just watching for service requests, and our staff needed more tools and guidance on how to keep up with that need. So we decided that stronger terms and conditions were needed to make the community a better space for our members. And so when holding users to a standard of conduct, we wanted clear, objective expectations that the user could access, to see what they're going to be held to. And then our staff needed the terms and conditions to support and provide rationale for any monitoring actions that they took. So we wanted them to be able to say, if they had to remove a comment and send the member a message, "I've removed your comment because it violated X portion of our terms and conditions," and then link our terms and conditions in that response. So to develop these new terms and conditions, we actually started by developing a mission and vision statement.
And the purpose of this was to focus members on our values and on the purpose of the site. In a previous role at another organization, I was on a team that determined training and quality assurance guidelines for staff. And for a while, our meetings consisted of this well-meaning team arguing in circles about best practice and what to do. And that continued until we made the inspired decision to come together and come up with a mission and vision statement, then let that be the determining factor, or the tiebreaker, for any arguments about next steps. And it worked. And our decision making went much smoother after we implemented that mission and vision statement. And so similarly, here at AAAS, we have a team of smart, passionate people leading AAAS community. So it made sense to follow a similar exercise for fleshing out our terms and conditions. And so this mission and vision statement really helped lay the foundation and set the stage for our terms and conditions. And it created a clear lane and objectives for our AAAS community discussion thread monitoring. So Emileigh and I actually came together to create the mission and vision statements, and then we included the team, and together we tweaked it until everyone felt good about where we finally landed with that.

Emileigh Clare: Yes, and I'll just add, too, you're probably not surprised, but AAAS is lucky to have a lot of scientific expertise in our organization. But key feedback doesn't just come from those expected places. We chatted a lot with our annual meeting team, and they provided a lot of great examples and their own personal experiences in how using strong language can be helpful against harassment, for example. They had had some poor experiences with members not understanding boundaries in the past at in-person events. And they advised us to include language on harassment and what will happen if a member continues engaging with a member of staff after they've been asked to stop. At that point, nothing like that had happened, but we wanted to take all feedback into consideration and really create a safer space. And later on, after adding that language in, this actually did end up happening on our site. So that's just one good example that you could really consult with everybody across the organization for their experiences on how to strengthen your terms and conditions, what kind of experiences they've had, and what makes them feel safer. Even if it's not currently an issue at the moment in your online space. We also asked our human rights group to review our terms and conditions, and they were able to take a break from using science to improve human rights around the world and give us the thumbs up that we were on the right path. And then on another feedback side, we vetted it with a lot of our members, and they have a lot of expertise and insight into online collaboration. What it's like for people who might be more vulnerable in online spaces, such as women and people of color. We also chatted with scientists who study online behavior, such as trolling, and what are trolls really like? What are their characteristics? What are their habits? So that was really interesting.
And we also talked with some members who study how information moves in online spaces, for example, the QAnon movement. So we had a lot of expertise and a lot of insight. And so we're going to share some of those key learnings with you right now. I have another fun story: one member we chatted with, whose expertise is in online collaboration, gave us a really good real-life example of this idea of echo chambers. And I thought, wow, we see this on community every day. So his example in real life was that [inaudible] was really supporting Elizabeth Warren for president last year. And he and all of his friends and his peers and his colleagues were like, "Yeah, Elizabeth Warren!" And when she didn't secure the nomination, he was really upset about it. And he started talking about it with his family, his distant family, and his neighbors. And they were like, "What? You're crazy. Why are you supporting Elizabeth Warren?" And he said for him, it was a very rude awakening. And he was just totally jarred by being with people outside of his social bubble and hearing what people really thought of his preference. And he said, "Those kinds of situations play out all the time in online environments. When you take people out of their normal feed or social interactions and put them with people of different backgrounds and different perspectives, there can be rude awakenings and it can be uncomfortable and it can lead to tension." And that was a really great story, because we see those types of things happen, not on a daily basis in community, but [inaudible] and maybe... do too. And so that's this idea of the echo chamber happening. We also talked with some other experts, and this was really key for us: if people log on and they see angry discussion, if they see trolling happening between certain members, they will automatically disengage. It only takes one login and one poor experience for that to happen. And we don't want that to happen.
And we continued to learn that if we wanted more participation from women and people of color, we needed to know and to understand that they feel particularly sensitive about engaging in communities right now. There's a lot of aggressive online behavior, and if they do not feel protected or nurtured, they don't need to be there. They can find better spaces. So it's really important to provide that. So with all of these caveats and barriers in the way, the one thing to note is that there is a good thing. There is a silver lining. All the studies, all the scientists that we've chatted with, have shared that if somebody does feel they're being harassed, if they are uncomfortable, the right thing to do is limit the visibility and posting rights of the harassers or whoever is doing the trolling. To really make that a comfortable space. So it is possible to do those things. So that's good news.

Candace Freeman: So we took all of these key learnings and we wrote our mission and vision statement and really let that be the foundation to then frame the rest of the development of our terms and conditions. And so we placed the mission and vision statement at the very top of our terms and conditions, to focus members on our values and on the purpose of the site. But that mission and vision statement also helped us, as staff, make decisions about how to handle specific posts. And how or whether we should update our terms and conditions, or should we expand on them, to address specific inappropriate conduct? We asked ourselves, "Does this person's post or comment align with our mission and vision? And is it covered in our terms and conditions? And if it's not, do we need to edit those terms and conditions to include that?" Emileigh?

Emileigh Clare: And we also wanted to make sure we say what hate speech is. What happens if you misrepresent yourself, such as giving yourself multiple PhDs and credentials you haven't earned. And we also wanted stronger protection against anonymous posters who were using community to continue to argue or post offensive content under that anonymous feature, which was not what was intended. So for the few folks that were using that, we added a reminder that we, as staff, can tell who anonymous is at any time, and they will continue to be subject to our terms and conditions. And ultimately, by incorporating the stronger language, as well as the mission and vision, into our terms and conditions, we wanted to also show that we were willing to nurture more vulnerable groups. To say, "We want you here. We have strong rules in place to protect your participation, and certain harmful behaviors are not going to be allowed." It's important not just to state that intent, but also to define and describe what is not allowed and let the members know.

Candace Freeman: So we took all of that work and went ahead and published our updated terms and conditions, with our mission and vision statement clearly listed at the top to set the tone and introduce these robust terms and conditions. Behaviors that are categorized as inappropriate or unwelcome attention or harassment are addressed throughout the terms and conditions. And so for clarity, and to equip both staff and the actual users of the community with explicit expectations, we define these terms with a focus on the impact and how a poster's message may be perceived, beyond just the poster's intent. So we went ahead and defined it: something inappropriate is anything that may be viewed as defamatory, et cetera. Unwelcome attention is any communication that continues after the recipient has asked you to stop. Harassment is any conduct that annoys, threatens, intimidates, et cetera. And so we wanted to make it clear that these things are not okay, and this is exactly what that type of behavior looks like. And this helped equip our staff when they have to call out that inappropriate behavior, so they can cite the exact item in the terms and conditions that the behavior does not align with. And so once we had fleshed out these terms and conditions, and we now had these robust T&Cs, we wanted to build out our protocol so that staff truly were equipped to monitor the community. And so a benefit of us having misjudged or miscalculated exactly how much work it was going to be for our staff means that we actually had a few weeks of the community being live, and a few weeks of conversations and discussions, to draw from. So we could draw from that actual experience as we crafted our scenarios and created our canned responses. And so we developed this protocol, we brought in the staff, had the staff provide input on the protocols, and we covered many different scenarios, with guidance for how to handle each type. And then we conducted classroom training.
So I trained all of the current staff on the protocol, so that they have buy-in, because they helped create the protocol, and then trained them on how to use it. And then I developed a quiz on the protocol to make sure that they really understood how we're going to be using that protocol. And then they began their actual community monitor shifts. And then for our new hires who come in, similarly, I train them on the protocol. They actually get to shadow current staff who are doing their community monitor shifts, so they can get that real-life experience. And then they take the community monitor quiz, and then they begin to work those shifts. And these community monitor shifts are very involved. They work them in half-day shifts, and they monitor all new posts that come in on community. We actually keep a list of problematic posters and the types of comments that they post, just so that we can keep an eye out for them, so that each person on their shift is aware. And we keep a log of all the monitoring activity. Whether it's that we have to warn someone gently about our terms and conditions, all the way to having to reject someone's moderated post and inform them that they're either going to have to rewrite the post and repost it, or just that the post is not acceptable and for what reason. And then we also use the watch words feature as well, which I think Emileigh will talk a little bit more about. But that's been really helpful for those posts that are automatically put into moderation that we then review.
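For readers curious how a watch-words mechanism like the one described above works mechanically, here is a minimal, hypothetical sketch: posts containing flagged terms are routed to a moderation queue for human review instead of being published immediately. The word list and function names are illustrative assumptions, not Higher Logic's actual implementation.

```python
# Illustrative sketch of a "watch words" filter: posts that match any
# flagged term are held for human review rather than published.
# The word list below is a hypothetical placeholder.

WATCH_WORDS = {"buy now", "free offer", "click here"}

def needs_moderation(post_text: str) -> bool:
    """Return True if the post contains any watched term (case-insensitive)."""
    text = post_text.lower()
    return any(term in text for term in WATCH_WORDS)

def route_post(post_text: str) -> str:
    """Publish clean posts; divert flagged ones into a moderation queue."""
    return "moderation_queue" if needs_moderation(post_text) else "published"
```

In practice, a community monitor would then review everything in the queue during their shift, as Candace describes, and either release the post or reject it with a citation of the relevant terms and conditions.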

Emileigh Clare: Right. So now that we've been live with our enhanced terms and conditions for 10 months, and we've actually even updated them again since then, I really want to drive home that we have seen a lot of evidence that adding stronger language, defining behaviors, and including the mission and vision really creates behavior change from members, because they want to continue to have the privilege to use the site. Some interesting things have happened: we've even seen some of our famous offenders, or infamous offenders, I should say, thank us for taking more of a role, and they've changed their behavior accordingly, to be more collegial with one another. And we're also continuing to grow in our key benchmarks, like logins and discussions. So that's really important to us. And finally, while we haven't yet seen if we've moved the needle as far as getting more women and people of color to participate, we are looking for solutions in-house with our research team to see how we can evaluate that. So we're really interested in doing that and moving forward. But it's important to note that arguments will still happen, and our role in those more contentious moments is still being assessed: whether we should monitor versus moderate. Our members have high expectations of our organization, and we want to deliver to that effect by continuing to facilitate high-quality conversations for their awareness and knowledge, where everyone can participate.

Candace Freeman: And so there's this healthy tension that exists, because we have a small team of smart, passionate people who are managing AAAS community, as we navigate creating an ideal safe space for scientific discourse while working within the constraints of being a nonprofit organization with limited staff and resources. And so we meet regularly and we discuss actual examples from our discussion threads. We look to our mission and vision statement for guidance as we calibrate our terms and conditions and constantly update our protocols, and we find that balance between what we would ideally like to do versus what we're actually capable of doing within the resource constraints that we have.

Emileigh Clare: And then finally, one key challenge and opportunity that I think all of us can work together on and continue to discuss is more functionality, more technical solutions, that enable people to be empowered. To make community the experience that they want it to be. While the watch words and the [inaudible] appropriate features have been hugely helpful to our membership, and we actually implemented those right away, there are still some things that would be great on our end, and I'm sure others have similar ideas. More and more social media platforms are allowing users to mark misinformation, mute others who are being annoying, or down-vote them if they're not giving good information. And members, users of social media platforms, are increasingly expecting those functions. So how can we provide similar controls to members to help empower them and really give them the spaces that they're looking for? So with that, thank you so much.

Alex Mastrianni: Well, that was great. They covered so many different topic areas and ideas for different ways to make your community safer for members. But I think the thing that I liked the most was how they talked about making their terms and conditions reflective of their mission and vision. I know so often when you think of terms and conditions, or at least when I do, you think of it more as a transactional document, or something that you just need to check the box on to be able to proceed to that destination that you're trying to reach. But really it can be a guidepost for, not just your members, but also for you and your organization when you run into issues or uncharted territory, and you need something to point back to so that you know, okay, how do I proceed here? What are we saying is allowed or not allowed? And if you take the time to do some of that, either upfront or over time, it can be a really great guide for you as an organization.

Heather McNair: To piggyback on that, I was really impressed that they have already updated that a couple of times. And I think so many organizations update it, they customize it to their organization, as they're getting ready to launch their community. And then they never look at it again. It's kind of set it and forget it; like you said, it's transactional. They check the box. But for AAAS, it is a living, breathing document. They're going to continue to adapt it depending on what's going on in their community and their environment. And the other cool part about that is, you have the option every time you update that, you can make people, your community members, re-agree to it. And every time they do that, they're reminded, whether they read all of it or not, they're reminded of this pledge that they're taking to participate in the community. What they've agreed to adhere to. But yeah, there were so many great takeaways, that was such an awesome session. And if you've run into some of those challenges, hopefully not many of them, I think they have an extreme example. But I think anyone who's been in community management for any length of time has at least run into a couple of them. But it's good to know you're not alone. And I think they've given us some great tips on how to deal with them.

Alex Mastrianni: For sure. If you are a community manager and you have a tip that worked for you, if you ran into something like this at some point or another, that we didn't talk about today, we definitely want to hear from you. So head over to LinkedIn or Twitter, we have posts up about this episode of the show, and please leave a comment. We'd love to hear from you and see what's worked, see if there's more that we can talk about on this discussion. So that's going to do it for today's episode of The Member Engagement Show, thanks so much for joining us and we'll see you again next week.



Today's Host


Beth Arritt

Association Evangelist

Today's Guests


Emileigh Clare

Assistant Director, Member Engagement, American Association for the Advancement of Science

Candace Freeman

Training and Quality Assurance Manager, American Association for the Advancement of Science