17 min

AI Notetakers: What You Need to Know About Security & Privacy, ft. Jason Makevich & Lily Li

By David Lee

 

 

 

 

Intro

Dave: As a marketer, one category of AI tools that absolutely saves me time and increases my productivity is AI transcription tools.

You've seen them everywhere: Zoom, Otter.ai, Riverside, Descript, the list goes on and on. And a lot of them offer their transcription services for free.

Who doesn't like free stuff? But there are some things we should really know about, from both a privacy and a legal point of view, when we use recording and AI transcription services on calls with other participants.

So today I'm joined by Jason Makevich, a security and privacy expert, and Lily Li, an attorney specializing in privacy and security.

Welcome both.

Lily: Thanks for having us.

Dave: So why don't we just start out with an introduction. Lily, why don't you tell us a little bit about yourself?

Lily: Sure.

My name is Lily. I'm the founder of Metaverse Law, a boutique firm that focuses exclusively on cybersecurity, AI, and data privacy law, and we work with everyone from funded startups all the way to public companies.

Dave: And Jason, why don't you tell us a little bit about yourself?

Jason: Yeah, sure.

Jason Makevich here with Greenlight Cyber.

We're a managed cybersecurity company in Southern California, with clients all across the country. I'm a CISSP, which is a cybersecurity certification in our industry, and I provide a lot of advice to small and mid-sized businesses around cyber risk. My team provides all the managed cybersecurity solutions for them.

How do AI note taking services work?

Dave: Very cool.

So Jason, why don't you start us out. From a technology standpoint, explain to us lay people: how do these AI transcription services work?

Jason: I mean, they're pretty great. As you mentioned at the beginning, we see them all the time now.

I love them for a lot of reasons. They make our lives a lot easier.

We're in a lot of meetings, and it's great to have a summary of those and the next steps, right? These tools do a good job of capturing that.

So it works by joining the meeting as a bot. It listens to everything going on, transcribes it, and uses large language models to take that information and make sense of it. Then it can summarize it. It can also interpret things. Let's say we were in a Zoom call right now. It might say,

Hey, David mentioned he was going to do these three things.

And Lily mentioned that she would do these and she also told Jason to do these.

And so it can actually give you those summarized next steps, which is awesome. It can give you screenshots, recording summaries, and everything else. Really, really powerful tools.

There are many of them out there that are great, but I know we're here today to talk about some of the things to think about when using them.

Dave: So does the technology, I mean, does it run locally on my computer or does this stuff all somehow get shoved into this mystical cloud?

Where we don't have possession, for lack of a better word, of that conversation and data.

Jason: I mean, it's more in the cloud, right?

So it's a bot that is cloud-driven; it joins the meeting from the cloud.

There's nothing really local about it per se, and I guess it could be mystical to some, but basically it's all cloud-driven by whatever that provider would be.

So typically they're going to be operating in something like Amazon Web Services, Microsoft Azure, or Google Cloud Platform. One of those, generally.
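
To make the pipeline Jason describes a little more concrete, here is a minimal sketch of the flow: the bot captures audio, a transcription step turns it into text, and a large language model summarizes it and pulls out action items. The function names and the sample transcript below are placeholders for illustration, not any particular vendor's API.

```python
# Minimal sketch of the notetaker pipeline described above:
# (1) capture audio -> (2) transcribe to text -> (3) ask an LLM to
# summarize and pull out action items. All names here are hypothetical.

from dataclasses import dataclass

@dataclass
class Utterance:
    speaker: str
    text: str

def transcribe(audio_chunks):
    """Placeholder for a speech-to-text step. A real bot would stream
    meeting audio to a cloud transcription service here."""
    return [Utterance(speaker, text) for speaker, text in audio_chunks]

def build_summary_prompt(transcript):
    """Turn the transcript into a prompt for a large language model."""
    lines = "\n".join(f"{u.speaker}: {u.text}" for u in transcript)
    return (
        "Summarize this meeting and list each person's action items:\n\n"
        + lines
    )

def summarize_with_llm(prompt):
    """Placeholder for the LLM call. A real service would send the prompt
    to a hosted model (typically running on AWS, Azure, or GCP) and
    return its response."""
    return "[summary and action items would be generated here]"

# Simulated meeting audio, already split per speaker for simplicity.
audio = [
    ("David", "I'll send the draft and book the follow-up call."),
    ("Lily", "I'll review the vendor's terms of service."),
]

transcript = transcribe(audio)
print(summarize_with_llm(build_summary_prompt(transcript)))
```

In a real product, the transcription and summarization steps are calls to the provider's cloud services, which is exactly why the conversation data leaves your machine in the first place.
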

AI Notetakers - Who owns the data?

Dave: Then, Lily, from a data and ownership perspective, it's not like it's on our local computers anymore. It's somewhere out there in the cloud.

What do we need to think about from that perspective?

Lily: Yeah, for sure.

So when I take a look at AI note takers, or any type of AI provider, I'm really careful to check their terms. There's data ownership and also data licensing.

And so the idea is that you're in a contract with the AI note taker.

And you're granting them either ownership of all the data that you're providing in the meeting, or you're granting them a license to use the data.

The terms will decide whether or not the license is limited so that they're only using your data to give you the services and nothing else.

Or it's a really broad license, where your data is actually being fed into their systems to improve their services, or for them to bundle up and sell or share elsewhere.

How are AI transcription services using our data?

Dave: Wow. When you first described that to me a while back, I was kind of like, oh, what do I need to look at? So I guess my question is, what do we need to look at to understand how these AI transcription services are using our data?

Lily: I'd definitely check out the terms and definitely check out the privacy policy.

You'll probably notice as you're signing up for these services that there are free or freemium options.

And then there are business or enterprise options. Generally, there are two different sets of terms. And if the product is free, then more likely than not, your data is going to be used as input into the services themselves.

Also, check the privacy policies. If they're legitimate, first of all, they'll have an updated privacy policy, and it will go into how they use your data and how they share your data.

One thing to note is that regardless of which service you use, whether it's free or enterprise, there will always be caveats for law enforcement and other types of requests for your data.

And so that's why, when we think about use cases for these AI note takers, there are certain highly sensitive cases, which I can touch on further, that shouldn't be provided to AI note takers at all.

Because in any circumstance, the AI company can provide this information in response to legitimate law enforcement requests.

When should you turn off AI transcription tools?

Dave: Well, not that we're proposing to do anything illegal here, but what are some of the specific use cases where, and can I even ask this of an attorney, you would recommend not using transcription services if you're on a Zoom call or whatnot?

Lily: Yes, and so again, this is just general information, this is not legal advice. No one listening to this is getting a bill.

But there definitely are certain use cases where I'd avoid AI note takers.

One of them is conversations with your attorney. Generally, you're seeking legal advice. You're seeking information about what's okay and what's not, and you'd prefer that information not be shared with a third party or with other companies.

Again, depending on your use case, if you have really sensitive trade secret information, this might be the time to set down the AI notetaker. Let's pretend you're in a big tech company and it's another big tech company's AI notetaker.

Do you necessarily want to disclose to them how you're developing your product?

Dave: That's a great point.

So anytime you're in conversations with an attorney, think twice. And with internal IP, be careful about when you're using these tools.

So that starts addressing things from the cloud services side.

Data & Security concerns with summarized notes

But Jason, we also talked a little bit about how it's not just that another company has that data. When a meeting ends, you get these nice summaries and these nice emails.

Maybe you can touch a little bit on that from a privacy and data security standpoint.

Jason: Yeah.

I mean, that's one of the things that worries me, actually. I remember the first time I learned about these bots, maybe a year or two ago. I didn't even realize there was one in a meeting.

It's hard to know sometimes, right? I was in a meeting with, I think, a vendor of ours.

And after the meeting, I got this email summarizing the meeting, and it was pretty cool. But it was an email, and I could forward that to anybody.

And email is an inherently insecure platform. It wasn't developed for the ways that we use it today, so we're constantly trying to chase security for email, which is just inherently insecure, right?

So now you've got potentially sensitive information on a platter for whoever's eyes are on that email. And that email might get forwarded, or the mailbox might get compromised, or it goes into some shared mailbox because you've signed up with Zoom on some other account or whatever.

Who knows where that's going to end up, right? And then that gets stored in email.

Well, do you ever delete those? Do you ever go back and purge those emails? How long is that going to be there? And who else might end up getting access to it?

And so there's just so much to think about.

You're in a meeting, you think it's confined to this space and these people, and then all of a sudden you have these captures of the meeting that go into email, a bunch of mailboxes no less, right? Generally, everyone that was in the meeting.

So that's kind of a big fear or risk that I think about.

I think the other challenge is, how do you even ask, hey, can we turn off the note taker? It's not mine, it's someone else's, right?

So I think we all need to step up and say, hey guys, to Lily's point, we're starting to talk about trade secrets or something important.

Why don't we go ahead and stop recording and take this offline, effectively, or just keep it live. So just things to think about, but the email part really leads to that potential for data leakage that I worry about.

Dave: Yeah.

Something definitely to think about.

Beware of transcript bots joining automatically

And then, everyone has an AI bot, right?

So we've all joined Zoom meetings and seen all these different bots pop up, and which bot wins? One of the questions I have is, how do you know it's an actual bot? We talked a little bit about that.

Because I can just rename myself from Dave Lee in Zoom and say, oh, I'm Dave's note taker.

Jason: Absolutely.

I mean, that's a good point. You really have to pay close attention to who's in your meetings, right? A lot of times I'll be in a meeting and I don't even know everyone in there. It's like Wedding Crashers.

Everyone just kind of assumes, well, they're with the bride or they're with the groom, right? And that's how these meetings are, probably half the time.

You're not going to know everyone in there, and everyone's going to assume they're part of the other party or with someone else.

But now with the note takers, yeah, we were talking about this the other day. What would stop someone from sneaking into a meeting by renaming their Zoom account to one of these products, keeping their camera off, and staying on mute? They'd probably look identical to one of those bots.

So, yeah, it's just things to think about. Anytime you're dealing with sensitive information in a meeting and you think it's private, pay very close attention to who's in that meeting, both human and bot.
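
As a rough illustration of the "wedding crashers" problem Jason describes, here is a small sketch of how a host might audit the participant list against the people and bots they actually expect. The names and roster are made up for the example; a real check would use whatever participant list your meeting platform exposes to the host or admin.

```python
# Illustrative only: flag attendees, human or "bot", who aren't on the
# expected lists. A renamed account posing as a note taker shows up too.

EXPECTED_PEOPLE = {"David Lee", "Lily Li", "Jason Makevich"}
EXPECTED_BOTS = {"Acme Notetaker"}  # bots you have actually authorized

def flag_unexpected_participants(participants):
    """Return any attendee who isn't on the expected people or bot lists."""
    allowed = EXPECTED_PEOPLE | EXPECTED_BOTS
    return [name for name in participants if name not in allowed]

# Example roster as the host might see it in the meeting sidebar.
roster = ["David Lee", "Lily Li", "Jason Makevich", "Dave's Note Taker"]
print(flag_unexpected_participants(roster))  # -> ["Dave's Note Taker"]
```
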

Dave: I'm certainly guilty of this.

Sometimes it's more from a, how should I say this, lazy-slash-convenience standpoint, where you initiate a Zoom meeting and you're like, ah, whatever, just let anyone in.

But what you're saying is to really pay close attention to who you're actually letting in. So putting people into a waiting room and making sure they are who they say they are is probably a good best practice.

So, Lily, and I don't think this is asking for legal advice, is it?

But from a legal standpoint, are there things we have to be concerned about when we're just firing up Zoom or an AI bot? I remember, way before the pandemic, I always used to ask, hey, is it okay if I record this meeting?

Mostly from a permission standpoint, because I wanted to make sure people understood that we were recording.

Some states require consent before recording

But are there other considerations beyond that in today's post-COVID world where everyone's using Zoom? Do we still need to ask for that permission, or is it automatically assumed? And do we need to be concerned about that state by state?

I don't know anything about that.

Lily: Sure, yes,

So there are quite a few states, including California, that are all-party consent states, which means that all parties to a phone or video communication need to consent to the recording.

Zoom is very interesting because they've actually been through the wringer a few times with different types of privacy class actions.

For instance, they actually paid out an $85 million privacy settlement as a result of a class action claiming that they inadvertently allowed third parties to come in and intercept or record calls without the parties' consent.

So this is a real issue.

You'll notice that a lot of platforms now have a recording notice or some other recording alert for the default platform recording.

But that doesn't mean it covers your own use of an AI bot or AI recording system. If it's not integrated with the platform, it doesn't trigger that notice. So, again, as a best practice, it's still a good idea to alert people to what you're doing.

Another thing to be aware of is that there are AI laws that govern notice and consent requirements if you're using AI for certain activities.

For instance, if you are conducting interviews and you're using AI to assess an interviewee, then you may have certain notice requirements in certain states.

And now with the new EU AI Act, there are going to be requirements going forward to notify individuals if you're using AI for interview purposes, performance evaluations, testing, and other things that might impact an individual's rights, abilities, and access to services.
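
To make the notice-and-consent point a bit more concrete, here is a purely illustrative sketch of a consent gate a meeting bot could run before any recording starts, taking the conservative all-party-consent approach Lily mentions. The function and prompt wording are hypothetical, and none of this is legal advice; actual requirements depend on the jurisdiction and your counsel.

```python
# Illustrative only: announce the recording and require an explicit yes
# from every participant before any audio is captured. A real bot would
# deliver the notice through the meeting platform's chat or UI.

def start_recording_with_consent(participants, ask):
    """Require consent from all participants; abort if anyone declines."""
    notice = (
        "This meeting will be recorded and transcribed by an AI note "
        "taker. Please reply 'yes' to consent."
    )
    responses = {name: ask(name, notice) for name in participants}
    declined = [name for name, ok in responses.items() if not ok]
    if declined:
        print(f"Not recording: no consent from {', '.join(declined)}")
        return False
    print("All participants consented; recording may begin.")
    return True

# Simulated responses standing in for a real chat prompt.
answers = {"David": True, "Lily": True, "Guest": False}
start_recording_with_consent(
    answers.keys(), lambda name, _notice: answers[name]
)
```
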

Dave: Oh, wow. I didn't even think about the EU. This is probably a topic for another day, but between GDPR on the privacy and security side and the California CCPA, I'm sure there's stuff that governs around that.

So that's really good information to know.

From your experience, have there been any issues with people using AI? And this is more within the public domain, right?

But are there things we can take a look at to get smarter, like how Zoom ended up in a class action suit, or any advice from that perspective on what we need to start thinking about?

Lily: Definitely. From a legal and privacy perspective, at least in the United States, there's a lot more permissiveness around using AI and around using people's data, as long as there's notice regarding it.

And you're giving individuals the ability to opt out of such recording and giving them certain rights to their data.

Again, if you're going abroad, and you're looking at the EU, then that's a whole separate conversation, and it becomes more about opting into a process.

Dave: Got it.

Jason, in terms of selecting a particular AI transcription tool, is there anything we should think about from a technical standpoint, besides looking at the Ts and Cs and the privacy policy, when we decide to use one platform or another? Because there are so many of them out there.


Risks of using free transcription tools

Jason: Yeah, I would say Lily touched on the whole freemium thing. If you're not paying for a product, then chances are you're the product, right?

That's the case with so many things out there in our digital world. Stop using free stuff.

Look at the terms and conditions, look at the privacy policy, compare what you get from the paid versions, and just understand what you're dealing with.

Sure, there are certain use cases where the free version of something makes sense, like if all you're doing is talking to friends or whatever.

And yeah, fine. But gosh, in my business, there's no way we'd be using something that might compromise privacy.

I'll add one more thing, outside of the meeting assistant but under the same umbrella of AI within the meeting: there are tools out there marketed as noise-cancelling software for the products we use, like Zoom and Teams.

Go buy noise-cancelling hardware instead. I'm not naming any products, but when you start looking at the privacy policies of these noise-cancelling products, really anything that joins a meeting or intercepts the audio on the meeting, there are free versions out there where, per the privacy policy, some of them own that data now.

They're allowed to do anything they want with it.

Potentially. I don't know all the legalese, but pay very close attention to the software that you and everyone in your company uses.

Corporate governance and use of AI tools

I think the real answer here is governance: having governance around the use of AI, and around software and technology in general, in your organization.

Governance policies need to be established and talked about, and they're not policies to put in a drawer.

They're policies that should be really worked on, really thought through, and then enforced throughout the entire organization.

And there are ways to enforce them to the point where it would be very difficult, if not impossible, for users to use something that they shouldn't.

But it starts with a conversation with the leadership of the organization, and usually some experts outside the organization, like Lily on the legal side, myself on the technical side, or folks like us who can help guide and ask the right questions.

And ultimately come up with what's going to make sense for you and your business.

Dave: That's a really good point on governance.

Back in my old IT days, and Jason, you know that was way back when, you could enforce policies so that your company computers were locked down.

So the end user can't install things.

But with remote work and people using their own personal computers, it becomes less something you can actually enforce from a policy perspective and more of a, I don't know what you'd call it today, a socialization thing. Because if I'm at home on my own personal PC and I decide to install some software, but I'm using it for business purposes, there should be a governance process: when you use any asset for company purposes, it probably should go through the IT department and be vetted and approved.

Not just to make sure it doesn't crash your computer, but...

Jason: And not just going through the IT department, but having the business leadership work with IT to define what should and should not be allowed.

And generally having outside expertise guiding that, because most IT departments may not have that experience or perspective.

If they're working in that IT department every day, they're not necessarily going to know what else is going on in all these other environments, the way providers, attorneys, or anyone outside who does this for a living and really understands governance and risk would.

There are a lot of different folks out there who can help with that. But I think that's key too: business leaders really getting involved in understanding and defining what should be allowed in the organization.
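
As one deliberately simplified illustration of the enforcement Jason describes, here is a sketch of an approved-tools audit: leadership and IT agree on a vetted list, and anything outside it gets flagged for review. The tool names and inventory below are invented for the example; real enforcement would run through your endpoint-management or allowlisting platform rather than a standalone script.

```python
# Illustrative only: compare an inventory of installed software against
# an approved-tools list agreed on by leadership, legal, and IT.

APPROVED_TOOLS = {
    "zoom",            # approved meeting platform
    "acme-notetaker",  # hypothetical paid, vetted AI note taker
}

def audit_installed_software(installed):
    """Report anything installed that isn't on the approved list."""
    unapproved = sorted(
        set(name.lower() for name in installed) - APPROVED_TOOLS
    )
    for name in unapproved:
        print(f"Flag for review: '{name}' is not on the approved-tools list")
    return unapproved

# Example inventory, as an endpoint agent might report it.
audit_installed_software(["Zoom", "free-noise-canceller", "acme-notetaker"])
```
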

Final Thoughts

Dave: Cool. Are there any last words of advice you'd like to give to managers and business owners in terms of using recording software and AI transcription tools?

Lily: Like Jason mentioned, have a policy in place. Even from a cost-savings point of view, if you're going to pay for an enterprise version of some software, make sure all your employees know about it and are using that, rather than using company funds to develop or work with competing software.

And also, there are a lot of different settings in many of the default platforms that allow you to give notice, use waiting rooms, and enable functionality that limits data leakage.

So, again, just really get to know what you're using in the company.

Dave: Yeah. Not just from a usability standpoint, because that's what I'm always looking at, like, oh my gosh, how easy is it to use this thing? There are things that need to be looked at behind the scenes.

Jason: On the flip side, I would also add, don't be so scared of the technology that you don't adopt it, right?

It's happening whether we like it or not. We're entering this new AI world, and don't be so scared of it that you don't leverage the technology that's out there, because your competitors will, and you'll be left behind.

In fact, you need to be very forward-thinking. You need to be paying attention to what's out there. If ChatGPT caught you off guard and you were the last person in your family to learn about it, chances are you're not doing enough to stay in front of technology, right?

It's just one of those things where you want to know what's going on, you want to know what's out there, and you really need to pay attention, because things are happening very quickly. But always have that mindset of security, privacy, and governance, and be working with the right people to make sure you're making good decisions.

Dave: Yeah.

I don't think any of us here are saying that we shouldn't use these AI transcription services.

It's more along the lines of being aware of how they work and what security and legal ramifications we need to think through before we hit that record button.

Very cool.

Outro

So thank you, Jason and Lily.

For those who would like to get in touch with them, I'm going to leave their contact information and the website below in the transcript.

Thanks everyone for listening and spending time with us.

There's more great educational content and interviews for you at dowhat.works/podcasts.

 

Jason Makevich, CEO, Greenlight Information Services

Linkedin: linkedin.com/in/jmakevich

Website: greenlight-is.com

 

Lily Li, President, Metaverse Law Corporation

Linkedin: linkedin.com/in/yuanjunlily/

Website: Metaverselaw.com

 

David Lee

Leveraging 20+ years of experience with Fortune 500 companies including Toyota, Beckman Coulter, and Deloitte, I help companies choose the right marketing strategies with data, systems and processes.

LinkedIn