
Social Background Checks w/ Bianca Lager

Chad Sowash

Social media is hotter than ever!


Okay, put your phone down. It's not time to check TikTok for pet videos and NBA highlights! You'll want to hear all the latest social media background check trends from Bianca Lager, president of social screening company Social Intelligence.


Should social media screening be a standard hiring procedure? Is social screening even more important in the age of remote work? What about detecting bias and intolerance?


You'll hear answers to those questions and much more in this Nexxt-powered podcast!



Bianca (0s):

The very nature, especially of social media, right? The social networking platforms, Facebook, Twitter, all this stuff. There's human context all over. There's language. There's culture differences. There's innuendos...


INTRO (12s):

Hide your kids! Lock the doors! You're listening to HR's most dangerous podcast. Chad Sowash and Joel Cheeseman are here to punch the recruiting industry right where it hurts! Complete with breaking news, brash opinion, and loads of snark. Buckle up, boys and girls, it's time for the Chad and Cheese podcast.


Joel (33s):

Yeah. You know what time it is! Your favorite podcast, the Chad and Cheese podcast. I'm your co-host, Joel Cheeseman, accompanied as always by Chad Sowash.


Chad (43s):

Woohoo!


Joel (44s):

So today we are, we're just giddy and a little bit thirsty to welcome our next guest Bianca Lager president at Social Intelligence. Don't call it an oxymoron. Bianca, welcome to the Chad and Cheese podcast.


Bianca (1m 0s):

Hello boys. Nice to be here.


Chad (1m 2s):

Good to have you.


Joel (1m 3s):

Good to have you.


Chad (1m 3s):

That was a very sexy intro there, Joel. Very nice, very nice.


Joel (1m 7s):

Well, I've, I've got my Tuesday-after-Memorial-Day voice going on.


Chad (1m 13s):

So today we're going to be talking about social media screening, but before we do that, Bianca, how in the hell did you get into this?


Bianca (1m 21s):

Oh gosh, great question. I was recruited. I was jumped in, I suppose. I was fresh out of business school, and a friend of mine, someone that I know through town and just kind of through other friends, started Social Intelligence, founded the company in 2010. And along the way, Social Intelligence became a bit distracted, I suppose, with a lot of different product lines. One product line was for the insurance industry, one product line was for the employment industry, and one product line was for government contracts. And long story short, we sold off the government contract stuff. It was kind of a mess, as you can imagine, and wanted to spin off the companies between the insurance and employment sectors.


Bianca (2m 3s):

And again, I was bright-eyed and bushy-tailed and had done a lot of sales and marketing work. Max Drucker, the founder, wanted somebody who could really commit themselves to kind of working out the employment sector stuff and reviving what was sort of a bit of an ignored product line for a bit. So we went to dinner one night and he said, Hey, will you come run my company? And as one does, one says yes. So that's where I am now.


Joel (2m 27s):

You went to Pepperdine. Is that the most beautiful campus in the world or not?


Bianca (2m 31s):

It's close. I also went to UCSB as an undergrad. So it's hard. I'm a Gaucho.


Joel (2m 39s):

Is that the litmus test for you? Like, the campus has to be on an ocean in order for you to attend?


Bianca (2m 44s):

Yes, that was one of the reqs for me to apply.


Joel (2m 48s):

2010, I don't know, the dawning of social media as a recruitment tool. I remember the days when people would not touch social media as a background checking, pre-screening mechanism. I mean, in the days of MySpace and Twitter, even, you had a lot of fake profiles go up. You had people wanting to just destroy people with these fake accounts. Like, when did that change, that employers started to be open to using social media? 'Cause there was a time way back in the day where they wouldn't touch it with a ten-foot pole. What changed?


Bianca (3m 22s):

Well, maybe they would in an official capacity or be searching for a vendor, but quite frankly, what's very true and what's always been true, and still is true, is people love to Google people. And people love to pull up that Facebook profile and check it out. And as you guys know in recruiting, a lot of it with hiring managers has to do with, do I feel comfortable with this person? Do I trust this person? What can I ascertain about this person? What kind of story does this person have? Can I figure it out? And so the reason Social Intelligence was founded and the reason we pioneered the industry is to say, well, that's not gonna work. That's not, that's not fair. And that's not really the best use of maybe that person's time, but also as a candidate, from the candidate experience, you have no control over, you know, somebody kind of poking through your pictures and creating a narrative about you that maybe isn't true, or maybe is true.


Bianca (4m 14s):

And to your point, fake profiles, are you even looking at the right person, or are you sort of making up a story without knowing who that person is? So certainly what we know to be still true, and what was true 11 years ago, is that that's what was happening, and that's the kind of temptation with a keyboard at our fingertips.


Joel (4m 32s):

When did it go from unofficial to official, I guess is the better question?


Chad (4m 36s):

When social media launched.


Joel (4m 38s):

It was not official. That's what she just said. It was like, oh, I'm going to do it on my own, but the company wasn't going to buy services to do it.


Chad (4m 45s):

That's because they didn't exist. But drama is so appetizing for humans.


Bianca (4m 49s):

It is.


Joel (4m 50s):

And drama can get you into court too, which I know is a fear a lot of companies had with using social media.


Chad (4m 55s):

It's still doing it.


Bianca (4m 56s):

Right.


Joel (4m 56s):

Which goes back to my question, Sowash. When did it become official from unofficial?


Bianca (5m 1s):

When Social Intelligence launched in 2010? That was...


Joel (5m 4s):

There's the answer. Wow!


Chad (5m 7s):

Where's the buzzer?


Joel (5m 8s):

When we launched is when it became official.


Bianca (5m 11s):

Well, that was the intention, right? I mean, that's the thing, but I think you're making a good point. I mean, it was an uphill battle those first couple of years, and kind of a reason why Social Intelligence had different product lines. It was trying to get that thing going, knowing that, hey, this is probably the better way to do this. And from the employer standpoint, the candidate standpoint, you know, take a look at it and maybe take it seriously about what kind of data and information you should or should not be using in this kind of decision.


Joel (5m 36s):

Do most states have laws where candidates have to be aware of the background check if it's social in nature?


Bianca (5m 43s):

In 2011, soon after we launched, a couple of people from the industry were like, what is this? And had that kind of same reaction that you're talking about, like, oh, this is, you know, going to be legal problems and this isn't going to work. And so the Federal Trade Commission, which was tasked at the time with regulating consumer privacy laws federally, came in and did an audit of Social Intelligence. They said, what are you guys doing here using social media for employment decisions? What are you doing and how are you doing it? So they did an audit, we had to answer questions to the Senate Privacy Committee, and many months and a lot of lawyer fees later, essentially what you can do is Google Social Intelligence and the FTC. There's an opinion letter that they put out and they said, okay, Social Intelligence is acting like a consumer reporting agency and, as such, you have to abide by the FCRA.


Bianca (6m 32s):

So, so far you are, and we reserve the right to check back on you later, 'cause you're a consumer reporting agency. And that changed everything. And now competitors that have come and gone since then have used that very same sort of precedent to say, well, the way that you make it official is you make it a background screening report, and you have the same rules as criminal checks, credit checks, all those kinds of things. You apply that same standard and that same process. And that's how you legitimize it.


Chad (7m 0s):

So do you carry that letter around with you everywhere you go?


Joel (7m 4s):

It's laminated.


Bianca (7m 4s):

I mean, I talk about it a lot. So luckily the interwebs does that for me. So yes, it's, it's not bad.


Chad (7m 15s):

I think this was actually one of your quotes: "All it takes is one tweet." So seriously, I sent one dumb drunk tweet and I'm fucked? And if we had social media back when I was a kid, or even in college for goodness sakes, or before that, I mean, there's so much history that's there, that is available today, that could really just tank you. Companies are looking for reasons, right? And all we're doing is now giving them more reasons. When does this stop? Where's the line?


Bianca (7m 46s):

No. So they're not, they're not actually just looking for that one little dumb off-color joke that you made, or the time you used the F word or something. You know what I mean? This is just...


Chad (7m 56s):

You saw that.


Bianca (7m 57s):

I saw it. It's just not a good use of anyone's time, nor is it something that is, depending on what you're saying, of course... but you know, when I say all it takes is one tweet, you're talking about a tweet that is extremely impactful. As in the preschool teacher who's talking about how they don't like Jewish people and they should die for X, Y, and Z reasons. Okay, that's a problem, right? You have, you have kids in that person's care, and now we have a problem. However, you know, it's just not on the same level. And it's comparing two different things to really say, I drunk-tweeted a picture of myself doing something dumb, you know, when I was in college or something.


Bianca (8m 38s):

And that's really the difference too. And that's, I think, really the key point in that idea of your boss, like, pulling up your Facebook and being like, oh, this guy looks weird, no thanks, I'm not going to interview him. Like, you know, we have biases, we can't help it. But when you put a set of criteria out there that is business related, that actually relates to a code of conduct policy, actually relates to something pretty serious, that employer is forced to really deal with that one way or another. Then you really are leveling the playing field. All of our clients, they don't want to do more work. Okay, they don't want to, like, shuffle through more posts and more pictures than they need to, but they are concerned about the stuff that is going to be impactful towards their other employees, towards their clients, towards society in general, that is putting pressure on them to make sure that they're keeping folks safe.


Bianca (9m 25s):

The idea really is then, how do you do that? Both from a technology platform perspective and just from a practical perspective, how do you put that product together so that it's useful? And how does that process actually work? Because to your point, you know, if you're just doing a Command-F, yeah, then maybe some stuff is going to be on there that's like, whoa, that's not fair, that's not even what I meant, you know? And also, when you're doing it behind the scenes, you're not giving the candidate the opportunity to say, well, hey, wait, let me give you some context around this. You know, let me dispute this if I need to. There's actually a process to say, hey, that wasn't me, let me just speak to somebody, talk about this with you, rather than it just being like, nope, you're done, you're out of here. You know.


Chad (10m 5s):

So how do you guys scale with technology? Because, you know, with Facebook today, obviously it's hard for them to contextualize sarcasm. I think I got kicked off Facebook one day for saying "Jane, you ignorant slut" from Saturday Night Live. I mean, there's, there's all of these things that come into play, especially when we're talking about scale of this magnitude. Now, obviously you don't have to scale to the Facebook magnitude, you have to just do it person by person. So how do you scale, number one, and then how do you make sure that you're getting the right Robert Smith or Jane Doe?


Bianca (10m 41s):

Yeah, for sure. And, and the truth is, I have the same problems as Facebook. And so if I wanted to sound really cool and, and impressive, I would probably say, well, we are a platform, the AI, we're just really sophisticated over here. And sure, I mean, obviously we're using technology here, obviously we're building algorithms here, and we're applying data analytics so that we can do our work as fast as possible. So is Facebook, so is Twitter, right? We are all trying to do these things as best as possible. However, the very nature, especially of social media, right, the social networking platforms, Facebook, Twitter, all this stuff, there's human context all over. There's language, there's culture differences, there's innuendos, there's sarcasm, there's all sorts of emotional language that people use online.


Bianca (11m 25s):

And so we do the same thing that Twitter and Facebook do. We hire people to double-check and to make sure that our algorithms are doing things correctly, and that those definitions of content are falling under the right legal definitions that our client needs to see reflected in their policies, and that we have had vetted through several, you know, legal people who know much more than I do about things like that. And really, that's the answer. I mean, and to your point, yes, Facebook has to hire tens of thousands globally. I mean, it's, it's a much larger issue for them. You know, we have to do that proportionally to the work that we're doing, and really the key there, and, you know, a secret for anybody who wants to compete with me...


Bianca (12m 5s):

The key really there is something that we talk about a lot, which is called augmented AI, right? You're building platforms to make people do their work faster. We spend a lot of time thinking about how to find the data. That's one thing, which is, you know, machine learning, facial recognition software, things like that. But there isn't just one answer to that. If I had the answer, Mark Zuckerberg... I'd be talking to him and not you guys, you know what I mean? He needs that too. And I know for sure we're not there. I don't know how that would ever be solved, because of the sort of emotional nature. Now, to your other point, as far as identity, it's the same kind of problem, right? It's a little bit easier to solve, actually, though, because there is evidence, so to speak, on the profile, to where you can build an evidentiary case that's legally defensible to say, hey, this is the right John Smith in Los Angeles.


Bianca (12m 53s):

What we base that off of is the information that we get. So just like a criminal report, and I know you guys are familiar with those, there's errors. They sometimes come back, maybe, you know, we didn't have the birth date right, so this is John Smith Sr., not John Smith Jr. There's things like that that can happen. The more data points that we have, and generally it's what you find on someone's resume, the things that people talk about online are their name, where they live, where they work, where they went to school. So that's the stuff that we use to identify if this is the right one. And then there's a series of sort of patterns of behavior that people use online, across multiple platforms sometimes. You can post the same picture to Instagram and Facebook at the same time. So there's patterns of behavior that you can identify as natural and organic, versus something that is suspected to be fake.


Bianca (13m 38s):

What we do sometimes find is fake profiles that we think the candidate, the subject, doesn't know exist. Those aren't fun days, you know. You're just like, oh my God, what a cesspool of nonsense. But it happens.


Joel (13m 53s):

When I think about some of the threats to your business, one of the things that comes to mind is sort of the changing nature of social media. You know, when you started 10 years ago, it was basically Facebook and Twitter.


Bianca (14m 4s):

Yeah.


Joel (14m 5s):

That's fairly easy from just, oh, it's text or it's something simple.


Chad (14m 9s):

TikTok baby.


Joel (14m 11s):

We have stories now, Chad. You know, Snapchat, disappearing messages, and TikTok is, I guess, sort of all over the place in terms of music and all kinds of stuff being thrown in on it. Are you, are businesses like yours, prepared to continue doing the same service as social media changes? Or are you going to have to make significant changes in how you do business in the future?


Bianca (14m 36s):

No, you absolutely nail it. You know, call it SWOT analysis. That's absolutely one of the things that's a threat that we have to keep tabs on. We have to stay really close to how all the platforms change their user settings, change their privacy settings. We have to make sure we are in compliance with those things. And so what we know, especially over the time period that we've been doing this, is that those things do change all the time. One of the things that platforms are very interested in is the core belief of building community and building a place where people can share information. So the basic nature of that is not likely to change significantly. However, there are some landmines there. We definitely have to walk through them and figure out how best to provide the most comprehensive information in a cost-efficient manner for folks.


Bianca (15m 24s):

And really that comes down to, okay, where's the biggest risk? Like, what are the content categories that create the biggest risk, and then where are the places that are most likely to cause them? And that's the top-down approach, essentially, that we take.


Nexxt PROMO (15m 42s):

We'll get back to the interview in a minute, but first we have a question for Andy Katz, COO of Nexxt. For clients that are sort of married to email, what would you tell them in terms of the metrics versus text messaging? It really depends on the audience you're trying to reach. I'm not going to even tell you text messaging is the right tool for every type of audience. You know, you're not going to reach, you know, a VP or senior-level person necessarily through text. You're going to reach more of those hourly workers, more of the gig economy, more of anybody that's on their feet all day long. So again, you know, you've got to break out email and text into two different categories. And sometimes, depending on the audience, the best thing to do is hit them with both of them.


Nexxt PROMO (16m 24s):

And it reinforces the message, the brand that's coming across. They'll know who the company is, and it's like any other commercial or podcast, you know, you might have to listen to it a few times before it resonates and it sinks in. I believe it's the same thing with a text versus email versus any other form of communication. For more information, go to hiring.nexxt.com. Remember, that's Nexxt with the double X, not the triple X: hiring.nexxt.com.


Chad (16m 55s):

So, you were talking about landmines. I see social media as a snake pit of bias. How can you guys de-bias the process?


Joel (17m 6s):

Snake pit of bias. That's good, Chad.


Bianca (17m 9s):

That's a new one. Yeah, creative points there. Yeah, snake pit of bias. I might take that one, put it on our website: are you in a snake pit of bias? Great question, and something that, especially, especially, you know, with the 2020 of it all, we've really had to deal with. The category of content that we focus on for most of our clients is intolerance. That is a very broad word. It can really make people feel a certain way.


Chad (17m 38s):

Contextualize that.


Bianca (17m 40s):

Right. And so is Social Intelligence suddenly sort of in the position of being the internet police? Being the intolerance police? Being the racism police? And that is something that we don't want to be. We don't want to be in a position necessarily to do that. So what we do is look to folks who have taken that on in their organizations, and we use those as credible sources. So what I mean by that is, we have something that we built, we're very proud of it, we maintain it and update it all the time, and we have folks that are focused on making sure it's accurate. We have something we call the Social Intelligence intolerance database. What that really is, is a series of definitions of hate symbols, hate groups, hate speech that have been defined by other organizations.


Bianca (18m 29s):

So organizations like the FBI, organizations like the CIA, organizations like the Anti-Defamation League and the Southern Poverty Law Center, and places whose sole purpose is to define these things as well as possible. So we accumulate all this stuff. And so when we're both informing our technology and informing our analysts who are actually double-checking the information, those are the things that we reference. So if something is on the report that says this is potentially intolerant, it also comes with a set of definitions that says, see this source, you know, they have defined it as this. And so that's how you can contextualize at least some of those things that come back, most of them.


Joel (19m 6s):

If you listen to our show frequently, you might know that Jeffrey Toobin is a popular subject on the show. For those who don't remember, Jeffrey was a reporter who thought it was a good idea, while his webcam was still on, to, shall we say, pleasure himself in a snake pit of bias. Curious about the work-from-home phenomenon, things that you saw in the business during COVID. Did everyone lose their mind, or was it just Jeffrey? Like, what kind of shit did you see in terms of frequency? Everyone's at home, I'm sure that stuff got a little bit crazy. Talk about that.


Bianca (19m 40s):

Yeah, no, completely. Jeffrey, RIP. What's really interesting is that towards the end of the year, we really thought about this and we're like, well, what happened? You know, sort of call it pre-pandemic and post-pandemic, like, with the BLM movement, everything, and then being at home and everybody kind of being very stressed out, we wanted to know what kind of behavior change we could ascertain. And, you know, we fancy ourselves sociologists to some degree. We spend a lot of time looking at the way people are expressing themselves online. And so we created our 2020 report to kind of reflect that, like, okay, what was pre-pandemic, post-pandemic? What were the things that came up the most? I personally, if I was, you know, going to put money on it, I was like, it's gotta be intolerance, right?


Bianca (20m 23s):

Like, there's got to just be people insulting each other more, you know, that's just how I felt. But it was actually violence, and a subcategory therein of people threatening others online. That was the one that spiked the most, that had the most significant change. And it surprised me. In retrospect, of course, it really made a lot of sense, because if you think about that idea of the stress, and everybody really only having these social media outlets to express themselves, it makes more sense that people felt that kind of pressure and were releasing it in some sort of way online. You know, a lot of, like, I'm going to kill you, I'm going to slap you, ranging from stuff that is clearly just people getting things off their chest to people really being like, I'm, you know, gonna run this person over with a car, or pictures with weapons. And it was intense.


Bianca (21m 14s):

And so I think that explains some of people losing their minds, people just being very stressed out.


Chad (21m 20s):

So do you have clients actually ask you... let's say, for instance, could I, as a client, ask you to say, hey, look, I don't want any of those Trumpers, can we go ahead and just weed them out? Can they pretty much customize the search and the background that you're looking for?


Joel (21m 36s):

Yeah, the snake pit of bias?


Bianca (21m 37s):

We probably wouldn't do that. It would really depend on the situation. So generally speaking, we have four general broad categories, and then people do customize within them. So it's intolerance, violence, potentially criminal activity, and sexually explicit activity. So generally speaking, we have a rule that we don't apply political affiliations in employment decisions unless an organization sort of has a reason for it. Actually, in 2018, someone from the Trump organization contacted us wanting to run these reports. Listen, I, you know, the Trump organization is very broad, that's a broad brush.


Joel (22m 12s):

Don't, don't moonwalk on us, Bianca. Let's go, let's go here.


Chad (22m 17s):

Was it Brad Parscale?


Bianca (22m 19s):

Truly, it kind of felt like a guy that was, like, trying to get in. And he had this idea to help with the campaign for 2020. And the idea was, though, and to be fair, I think, to organizations that do have a political reasoning, they wanted to kind of see if there was anyone who was being a double agent, right? Like, if someone was going to be a false actor and join the organization, but really they're just there to kind of mess things up. It was kind of a second where we took a pause and said, well, you know, technically they have a reason, and a lot of these people were actually volunteers, so the situation wasn't necessarily an employment one, but it mimicked one anyway. Complicated stuff. You know, I don't think this person actually had any influence over the campaign.


Bianca (22m 59s):

And again, it was just somebody who was in one of the state caucuses, I think, trying to dredge something up. But the idea of it was, I mean, generally we kind of shut it down. We were like, we don't do that. But the idea, I was like, well, they actually have a reason for something like that now.


Chad (23m 14s):

Yeah. You could have been the next Analytica. It could have been Bianca Analytica.


Bianca (23m 21s):

Yeah. Yes. I talked a lot about Cambridge Analytica and yeah, I exhausted that conversation. Right.


Joel (23m 28s):

Employers have a lot of options when it comes to screening and background checks. There's, obviously, you mentioned criminal, drug screening, and now social screening.


Chad (23m 37s):

She mentioned criminal, and then she said, you guys know this, so I just want to put that out there.


Bianca (23m 41s):

Well come on


Joel (23m 43s):

I'm glad that you mentioned that, Chad. Where does social screening come in the hierarchy of screenings? I mean, it used to be sort of a nice-to-have, but not essential. Are you moving more into the essential with the drugs and the crime?


Bianca (23m 58s):

Moving there, certainly. And it's interesting because both drugs and crime now, in this brave new world we live in, they have a lot of pushback, right? There's pushback, and there are some laws changing in terms of fair chance hiring, ban the box, and these movements of, hey, you know, is this really fair? When do we do this? What kind of questions, what kind of criminal activity can really be held against somebody? All of these changes are really lending themselves, if you think about being a person who's in the business of selling background screens, to sort of an opportunity to have alternative data sources. But in general, the appetite for some of these corporations and companies, they're trying to actually not just distance themselves for the sake of distancing themselves; all organizations want the data that's going to reflect the best and most productive workforce.


Bianca (24m 48s):

That's what they're after, right? And so people are starting to think more broadly about that and changing some of their more traditional criminal and drug testing and things. And then of course, again, with the year that we've had, the Capitol riots, I think it has really led to a peak point here in understanding, like, okay, this really is making a big difference and we really might need to pay attention to this. So they're looking for ways to say, we can't do it willy-nilly. Like, how do we do this in a legit way? How do we legitimize it?


Joel (25m 17s):

I'm so glad you mentioned the riots. And my last question, I'll let you out on this: I'm a big fan of Black Mirror. And I'm sure that, you know, the episode where basically everyone in the future is tied to their, I dunno, social equity, their score, how many likes.


Bianca (25m 35s):

Right.


Joel (25m 35s):

Everyone has a score. And I want you, in part, to scare the shit out of me, but also I want you to say some stuff that I can play back to my teenagers so they're very conscious of this. Like, how far does this social checking, if you will, go? I mean, are we getting into college? Will this be part of that three years from now? When I'm going into a senior living center, are they going to check my, my social profile?


Bianca (26m 1s):

Oooh, new market for me, thanks.


Joel (26m 2s):

You know, if somebody wants to get an apartment or a mortgage, like, how far does this nightmare go?


Bianca (26m 9s):

Yeah, well, I would say, well, first of all, every time I meet someone on a plane, or used to in an elevator or whatever, and they say, what do you do? The next thing they say is, can you come talk to my kids? So I'm well versed in that setup. But yeah, no, that Black Mirror episode, that was terrifying. And really, I think, though, an awful but obviously important, thought-provoking concept of how much credit are we giving these kinds of platforms, and how much are we relying on them outside of personal interactions, things like that. So my reaction, my gut reaction to your question is, you do have a choice about how much you are going to be giving these platforms and how much information you are going to be giving them about your life and everything like that.


Bianca (26m 52s):

Now, of course, you could say, well, marketing and things, I really don't, and you have to promote yourself, or all these kinds of things. And that's true to some degree. However, I think that as a society, those are very extreme examples of things that could happen. I think it's a slippery slope, though, and I think it's a fair one to think about people's own confidence levels and people's reliance on what kind of meaningfulness you get from social media, both in personal use and then from an employment standpoint. I think that there have certainly been credible, actual risks and threats that make that something that is necessary for employers to look at. Employment is, while certainly something that people should have as many opportunities for as possible...


Bianca (27m 37s):

These companies have the decision-making control to decide whether or not you get a job. Things that are a little more fundamental to the pursuit of happiness here, in terms of housing and stuff like that, I suspect to be a bit more of a dangerous category in terms of understanding the limitations of, over here you can live, and here you can't live, things like that. So I think we just all have to be vigilant as a society and keep boundaries around both your own personal... the way you feel about your online social media profiles, and then also, you know, how meaningful this data is going to be to applicable circumstances. Now, that being said, the reason that employers need to deal with this data is because there are folks out there that are absolutely abusing it and tarnishing it and putting very dangerous and upsetting content on there that is very disruptive to a productive society.


Joel (28m 33s):

Are colleges doing this?


Bianca (28m 35s):

Not in the... mostly for, like, employees, like staff and faculty. We've seen some, you know, toes starting to dip in the water, especially for, like, a sports program, especially the big-money sports programs, let's say.


Chad (28m 48s):

Yeah.


Bianca (28m 48s):

But I think that that's also probably cost-prohibitive for them in some ways, you know, just to kind of understand what the profile is. I mean, it's a whole thing. It's not just the cost of the report, which is, you know, about the cost of other types of background screens. So it's not the actual cost, but it's actually taking that adjudicative decision-making on, and how you play into that, how it takes a role. Also, you know what I mean, this content, those are minors, right? I mean, the content that they're creating then is as a minor. So I think you would probably get a bit more legal conversation, at the very least, about what is and isn't okay there. All that being said, you know, I talk to high schools in our local area and I talk about it, and the best advice you can give is, you know, look at those big, scary categories that I talk about all the time.


Bianca (29m 34s):

Okay, like, stay away from that stuff. And it sounds trite and it sounds silly, but be kind, you know. If you're out there insulting folks, it's not necessarily going to get you the attention that you might be craving. So just be wary.


Chad (29m 49s):

You heard it first here, kids. Be kind.


Joel (29m 53s):

Bianca Lager, everybody.


Chad (29m 55s):

From Bianca Lager. Bianca, if our listeners want to check out a little Social Intelligence action, where would you send them? Or maybe even to connect with you, where should they connect with you?


Bianca (30m 6s):

So certainly you can find me on LinkedIn and Twitter and all those good places. I'm trying not to create too many storms on those platforms, but otherwise Social Intelligence has a really incredible blog and we create a lot of content and we really think deeply about all these kinds of social issues and how we play a role in that. So that's really interesting. Also podcast listeners, we are providing a special discounted offer to you. So if you go to socialintel.com/podcast, we'll hook you up with a free sample report and then lifetime wholesale pricing. So get on it.


Joel (30m 39s):

Discount everybody.


Chad (30m 40s):

Wholesale pricing. I love it.


Joel (30m 44s):

Everything must go.


Chad (30m 46s):

It's a snake pit of clearance items.


Bianca (30m 50s):

I really liked that.


Chad and Joel (30m 53s):

We out, we out.


OUTRO (31m 51s):

Thank you for listening to, what's it called? The podcast with Chad, the Cheese. Brilliant. They talk about recruiting. They talk about technology, but most of all, they talk about nothing. Just a lot of shout-outs to people you don't even know, and yet you're listening. It's incredible. And not one word about cheese. Not one. Cheddar, blue, nacho, pepper jack, Swiss. So many cheeses and not one word. So weird. Anyhoo, be sure to subscribe today on iTunes, Spotify, Google Play, or wherever you listen to your podcasts. That way you won't miss an episode. And while you're at it, visit www.chadcheese.com. Just don't expect to find any recipes for grilled cheese. It's so weird. We out.
