The Encrypted Economy

Is Encryption Becoming More Important to the U.S. Gov? Paul Lekas, Head of Global Public Policy, Software & Information Industry Association (SIIA) - E92

Eric Hess Season 1 Episode 92

On this week’s episode of The Encrypted Economy, our guest is Paul Lekas, Head of Global Public Policy at the Software & Information Industry Association (SIIA). We discussed current legislative efforts prioritizing privacy and privacy-enhancing technologies, as well as the implications of sanctioning code versus individuals. Be sure to subscribe to The Encrypted Economy for more insight on the adoption of privacy-enhancing tech and data security. 

Topics Covered:         
·  0:00   Introduction
·  5:00   How has the View of Encryption Shifted in the Past Decade?
·  8:50   The Good and Bad of Encryption Software
·  12:00   How the SIIA Deals With Regulating Code
·  20:00   Which Agencies Are You Seeing the Most Coordination From?
·  21:40   Has Legislation Weakened Encryption?
·  28:40   Building Effective Privacy Legislation
·  39:00   Should Corporate Decision Making Be Culturally Driven?
·  43:30   Discussing the Definition of Commercial Surveillance
·  48:00   Framing Privacy as a Bipartisan Issue
·  57:30   Intersection of Technological Solutions and Social Goals   
Resource List:
·      Paul’s LinkedIn
·      SIIA Twitter
·      SIIA
·      FTC
·      Advance Notice of Proposed Rulemaking on Commercial Surveillance
·      Government on Encryption
·      Tornado Cash
·      EARN IT Act
·      American Data Privacy and Protection Act
Follow The Encrypted Economy on your favorite platforms!
Twitter
LinkedIn
Instagram
Facebook

 


Eric: [00:00:00] The Software & Information Industry Association is the principal trade association dedicated to the entertainment, consumer, and business software industries. Its membership represents over 800 large and small technology, data, and media companies worldwide. It provides opportunities for members to interact and share information, and, importantly for this podcast, it is very influential in representing its membership's interests in DC in areas like intellectual property protection, encryption, online privacy, electronic commerce, taxation, and educational technology.

Paul Lekas is its Head of Global Public Policy, and he is right in the middle of it, working in DC, actively on the Hill and with regulators, and working globally across its membership. He really has his finger on the pulse of what's impacting the interests of the software industry. We had him on today to talk about current legislative efforts relating to privacy,

privacy-enhancing [00:01:00] technologies, or PETs, as well as the implications of sanctioning code versus individuals. One of the points that we talked about at the end, which was quite interesting and which I wanted to note so it doesn't just hang out at the end, was the FTC's advance notice of proposed rulemaking around commercial surveillance, which could have profound implications for federal privacy law in the United States for businesses, not for government.

By the way, note that it's very much happening right now, and I guess because of all the other news out there, FOMC tightening and inflation and Ukraine, what have you, it really hasn't gotten as much public attention as it probably merits. But obviously there are a lot of things that we talk about in this podcast that merit that attention, which is why we did this podcast with him today.

Also note that if I use terms like good or bad in this podcast, I'm doing so to frame a talking point, not because I actually believe it can be so simplistically labeled. Anyway, as always, if you enjoy this podcast, share it. [00:02:00] If you have things you wanna hear about on the podcast, reach out to me.

I'm definitely eminently reachable. Go to theencryptedeconomy.com, or you can send me a note at eric@hesslegalcounsel.com. But if you like it, share it. Thanks so much for listening regularly, and I hope you enjoy this episode with Paul Lekas. 

Welcome to The Encrypted Economy, a weekly podcast featuring discussions exploring the business laws, regulation, security, and technologies relating to digital assets and data.

I am Eric Hess, founder of Hess Legal Counsel. I've spent decades representing regulated exchanges, broker-dealers, investment advisors, and all manner of FinTech companies for all things touching electronic trading, with a focus on new and developing technologies. 

Excited to have Paul Lekas, the Head of Global Public Policy at the Software & Information Industry Association, on the podcast.

Welcome Paul. 

Paul: Thank you. It's a pleasure to be here. 

Eric: Yeah. So why don't we start off with a little bit of your background leading up to this role, and we'll take it from [00:03:00] there. 

Paul: Great. So, I'm a lawyer by trade. I practiced for a while in New York and in Washington, DC, and I was a litigator.

I dealt with investigations. I did a lot in the financial space and some other areas. I joined the Department of Defense during the Obama administration, and I served as Deputy General Counsel there. Following that, I worked on the National Commission on Military, National, and Public Service, which was created by Congress.

It was an effort to try to figure out how we can get more Americans to serve their nation in different capacities. Thereafter, I worked at the National Security Commission on Artificial Intelligence, where I led work on foreign policy and also worked pretty extensively on privacy and civil liberties.

And that led me more toward a technology policy trajectory. When I left government, I joined SIIA to head its Public [00:04:00] Policy and Government Affairs activity. SIIA is an association of more than 450 companies across the information landscape. That includes financial information, education technology, publishers, platforms, software developers, data analytics firms, and startups.

So, we essentially work on information and data policy in the United States and other countries, and at the state level. Privacy and data protection are at the top of that list. Artificial intelligence is important, and there are a number of other topics that we dig into.

Eric: Yeah, we could do multiple podcasts, but we'll try to stay focused.

One of the things you touched on was your role in the Obama administration. Just curious, as we start to talk about encryption: it's often said that Joe Biden, as a senator, was one of the biggest proponents of treating encryption as a defense secret and [00:05:00] making it illegal to export.

He effectively started the crypto wars. Now, in your time with the Obama administration and your time at SIIA, how do you think that view has either matured or not matured over the years? I know there are a lot of other policy questions, but how do you see it from an administration perspective?

And this is not a political question, just a directional one. How do you think that view on encryption has matured? Because at the time, it certainly seemed very distrustful; encryption was seen as potentially very harmful if it were widely disseminated and exported.

Interested in your thoughts. 

Paul: Yeah, no, that's a great question. I think that the view on encryption at the government level has matured over the past decade. I recall that towards the end of the Obama administration, there were intense discussions about back doors for law enforcement, for example. [00:06:00] 

And that was an issue that was pretty fraught, and there were differences of opinion within the administration and also more broadly when you bring in civil society and the private sector. And I feel like that debate has moved. There's much less attention now to, or focus really on, whether law enforcement should have backdoor access to devices.

Instead, there seems to be a growing understanding that encryption serves many important social purposes. The focus is really on other ways, from the law enforcement angle, in which information can be obtained, and in the private sector, on really expanding the use of encryption in a number of different contexts.

And we can talk much more about all of that. In terms of the export control side, I think export [00:07:00] controls are really challenging. There are a number of different ways in which you can apply export controls to achieve different policy goals, often in national security. Export controls on hardware are probably the easiest to implement and to enforce.

It gets much more challenging when you look at export controls around data, or export controls around code or algorithms. And I know this issue has been in the news recently with the Tornado Cash matter and the sanctions, which relates somewhat to the export control approach. And I think that when you're talking about code, and specifically open-source code, there are strong First Amendment arguments to make.

That information is First Amendment protected speech, and that is not really the appropriate [00:08:00] place to apply export controls. I also think it's very difficult to enforce those types of things. What we've seen in the current administration on the export control front is primarily a focus on the hardware, the semiconductors, the equipment required to produce semiconductors that are critical for developing advanced computing capabilities.

Eric: And so, staying on that thought for a little bit in terms of open-source software, do you think that there is a developing view? Again, I'm just looking for where the wind is blowing. Paul's like, I can't believe you're taking me right down the rabbit hole on this one right out of the gate.

At least let me get warmed up, for God's sakes. But do you think there is a distinction in that, in some cases, the encryption software itself is good [00:09:00] because it has a good use, and in some cases the encryption software itself is bad because it has a, quote unquote, bad use, we'll just label it that for the sake of convenience? And I'm making a distinction between the code and the conduct, or the people who are utilizing the code.

And how do you think that is playing out right now?

Paul: Yeah, so that is the key distinction. It's between the technology itself, the software code, and then how it's used: the conduct, the activity, the potential harm that can result, and also the benefits that can result. And so, as an association, what we do is try to advocate for policies that are technology neutral, for the most part, because the thing about technology, including code and software, is that it is not inherently good or bad. And it [00:10:00] really depends on how the technology is used. I think at the policy level, you're seeing something very similar: there are a number of initiatives that this administration has launched that actually promote the use of encryption.

Particularly on the foreign policy side, encryption is a way to protect free speech, and particularly in autocratic nations, to promote human rights, to allow individuals to communicate with one another without the risk of government interference. At the same time, encryption can also be a way to mask illicit activity, and that is problematic.

So, what I think you're seeing now, with respect to the latter, the bad conduct, is a focus on ways that the government can [00:11:00] actually enforce better without having to create back doors. And I think this is reflected in a series of reports that came out, I believe it was last week, from various agencies relating to the digital assets executive order that the president had issued earlier this year, where there's really a focus on: how do we cooperate with foreign law enforcement agencies?

How do we better track illicit activities? What sort of obligations can we put on banks and other financial intermediaries to detect potential money laundering, human trafficking, and other illicit conduct, without really going to the technology itself? 

Eric: And sticking with Tornado Cash for a little bit, not to make it a Tornado Cash focus.

Paul: Sure.

Eric: But in that context, it was a smart contract that was sanctioned, not an individual. And historically, although [00:12:00] not exclusively, sanctioning has been used for individuals or regimes. In one case, I think there was a plane; in other cases there was a yacht, I guess because it was associated with a bad actor.

But this was unique in that it was specifically focused on sanctioning code. And thus raised all the types of issues. As whom interacted with code because you could interact with an individual, but a code, it's, there's many different iterations. A code could be a lot more effective at interacting than even an individual, like as opposed one to one.

And so, I guess question number one is: how do you see your organization contemplating that kind of thing? And I'm not looking for you to speak for the whole organization and make new policy, and then you get a phone call at the end of this saying, what the hell did you say that for? But how do you see that in terms of consistency with the SIIA position on [00:13:00] it,

and the use of the code, and the competing concerns of the government trying to achieve certain ends but also trying to respect, or potentially not respect in certain cases, the notion of technology neutrality, and not actually attacking, to your point, free speech?

Paul: Yeah, that's a great question. Let me first speak to the general issue. I found it interesting that Treasury issued a clarification after they initiated the action against Tornado Cash, clarifying that, with respect to the open-source code, it would not be a violation of sanctions laws to interact with the open-source code itself in a manner that doesn't violate the other requirements.

So, in a manner that doesn't involve a prohibited transaction. And [00:14:00] I think that's really important, because it reflects, potentially, a recognition that open-source code is something that should be protected and can be used for many other purposes, just like any other speech out there that is protected by our First Amendment.

With respect to SIIA, it's interesting because crypto is not an area that we have dug into. And that's because we are a member-driven organization, and our members have not directed us to really focus on crypto as an area yet. It is an area that we keep tabs on, because it's becoming an increasingly important part of our economy.

And something that we do have pretty established experience in is the First Amendment. The First Amendment arguments around code, and specifically open-source code, are fairly strong. Those are ones that we would definitely [00:15:00] endorse and feel comfortable coming out and speaking to. I don't think that we're going to issue any formal statements about this particular action, just because it's in the crypto space.

But it is important to emphasize that this is First Amendment protected speech, and that's something that is critical in other contexts to our members' business in lots of different areas.

Eric: Excellent. Maybe we'll shift gears a little bit and talk about privacy-enhancing technologies and the US government's role in that. Can you help define where the US government is embracing that, and where the fears are in that embrace?

Paul: I think what we're seeing in the US government is a very strong embrace of privacy-enhancing technologies.

And it's a bit of reading tea leaves, but my sense is that the US [00:16:00] sees PETs as a way to achieve various goals that they've set out in the technology space. That is largely on the international front, but also domestically. I think there is a growing sense, and you see this coming out of FinCEN, the Financial Crimes Enforcement Network, and also the National Institute of Standards and Technology and a number of other agencies that are working together, that PETs as a category include tools that can help to detect the sorts of things that we're talking about.

And I think that's a big reason why the encryption debate has changed. In the past, it was really a question of individual privacy versus certain social or public interests, specifically around law enforcement. Now we're actually seeing encryption [00:17:00] as being an enabler.

The privacy-enhancing technologies that incorporate encryption, such as homomorphic encryption, are ways to actually do a much better job of trying to identify potentially illicit conduct. The other strand that we're seeing come out is this idea that PETs can help to enable cross-border engagement and data flows in areas where that has become more challenging.
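To make the homomorphic encryption idea concrete, here is a minimal sketch, assuming the open-source python-paillier (phe) library rather than any specific tool discussed in the episode: an analyst can total encrypted amounts and check a threshold without ever decrypting the individual values, and only the key holder can read the result. The amounts and the reporting threshold are illustrative assumptions.

# Minimal sketch of additively homomorphic encryption, assuming the
# python-paillier library (pip install phe). Values and threshold are
# illustrative only, not any agency's or member's actual workflow.
from phe import paillier

# The data owner generates a keypair and keeps the private key.
public_key, private_key = paillier.generate_paillier_keypair()

# Individual transaction amounts are encrypted before being shared.
amounts = [1200, 450, 9800, 300]
encrypted_amounts = [public_key.encrypt(a) for a in amounts]

# An analyst holding only the public key can still sum the ciphertexts.
encrypted_total = sum(encrypted_amounts[1:], encrypted_amounts[0])

# Only the data owner can decrypt the aggregate result.
total = private_key.decrypt(encrypted_total)
print("Aggregate:", total, "exceeds threshold:", total > 10000)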

And there are a number of trends underlying this, and then I'll get a little more into what we're seeing specifically in terms of government action. One of them is that internationally there is a growing trend towards data localization on the digital side, trying to keep data within a particular geographic jurisdiction and not let it cross borders.

And PETs are a way to be able [00:18:00] to make productive use of data when you're not actually able to view all of the content in that data, using technology to generate results that can actually lead to actionable outcomes. There is certainly a lot of interest in the government in making use of PETs in different contexts.

I mentioned that there are many agencies involved. There's an interagency working group, with representatives from a number of different agencies, that is looking at ways to advance the adoption of PETs and applications for PETs. Our organization has submitted comments and has held meetings with that interagency group.

And I think it is drawing a lot of attention. The interesting thing is going to be how the government is actually going to promote PETs in a concrete way. We [00:19:00] think there's a lot of room to do that on the FinCEN front. There's also room to incorporate PETs much more into some of the international dialogues that are going on.

We've seen recently that the United States and the European Union are looking to pilot some projects around PETs. The United States and the United Kingdom have already launched a PETs competition to try to develop innovative uses and applications of PETs in different contexts that have socially beneficial uses.

I think one of the important things to convey in this is that there are a number of PETs that are quite mature. There's a lot of attention being paid right now to the research and development of PETs, but there are PETs that are on the market. There are PETs that are readily deployable.

And what [00:20:00] I hope we will see over the next 12 months is more attention to how we can actually get those PETs out there to achieve some of the objectives that people have laid out. 

Eric: Excellent. So, you think FinCEN is probably one where you'd see the most coordination, possibly even across jurisdictions.

What are some of the other ones? What other agencies do you think are involved?

Paul: So, FinCEN has been looking at this for some time, a couple of years, probably more than that. And I think that is a very promising area, because there are a lot of applications in the financial context. Healthcare is another; there's a lot of interest in how PETs can enable more scientific research and social science research.

And I think that will be a priority of the government, given the involvement of the National Science Foundation and NIST in these efforts. I think they really want to look at [00:21:00] how they can incorporate PETs to do better research around healthcare outcomes, and issues where the data that is relevant to the research is subject to heightened privacy protections. PETs can enable true anonymization of data in a way that can generate useful insights, in ways that I don't think were really feasible from a technological point of view even five or ten years ago.
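As a concrete illustration of the kind of anonymized-but-useful research output Paul is describing, here is a toy sketch of one PET in that family, a differential-privacy-style aggregate; the NumPy usage, the data, the bounds, and the epsilon value are all assumptions for illustration, not anything referenced in the episode.

# Toy sketch of a differentially private mean: researchers get a useful
# statistic while calibrated Laplace noise masks any individual record.
# Dataset, bounds, and epsilon are hypothetical.
import numpy as np

patient_ages = np.array([34, 51, 29, 62, 47, 58, 41])  # hypothetical records

def dp_mean(values, lower, upper, epsilon):
    """Return a differentially private mean of values clipped to [lower, upper]."""
    clipped = np.clip(values, lower, upper)
    # Sensitivity of the mean of n bounded values is (upper - lower) / n.
    sensitivity = (upper - lower) / len(clipped)
    noise = np.random.laplace(loc=0.0, scale=sensitivity / epsilon)
    return clipped.mean() + noise

print("Private mean age:", dp_mean(patient_ages, lower=0, upper=100, epsilon=1.0))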

Eric: Excellent. And let's shift gears to legislation that might potentially weaken encryption. These are things that ostensibly have a social good, combating child pornography, how could you argue with that? But there are various [00:22:00] bills that would have derivative impacts or knock-on effects, right?

The Eliminating Abusive and Rampant Neglect of Interactive Technologies Act, or EARN IT. How do you see those progressing? Has the dialogue changed? Is it more mature? Is it becoming harder to compete with various desires to control that conduct?

Paul: Yeah, that's a great question.

The EARN IT Act is one that generated a lot of attention earlier this year. It's been around for a few years in different forms. The EARN IT Act is geared towards a really important aim, which is to reduce the amount of, and access to, conduct related to child sexual abuse material, or CSAM.

And I don't think anybody can really object to that [00:23:00] objective. However, the EARN IT Act also contained some language saying that the use of encryption, end-to-end encrypted messaging services, and other forms of encryption could serve as a basis for liability for an interactive computer service that was hosting, most likely inadvertently, child sexual abuse material.

And that generated a coalition that included civil society organizations and the private sector, really across the political landscape, concerned that this would have a chilling effect not only on speech, but also on the ability to use advanced technology and software that can further enhance privacy in ways that are socially [00:24:00] beneficial.

The EARN IT Act is interesting because it really encapsulates the challenge that we have when we're talking about the balance between the public good and individual privacy and individual freedom, and it did not strike a good balance. The EARN IT Act is not, right now, slated to proceed in any constructive form.

But I think that this is a debate that we're gonna be continuing to have in the ensuing years. 

Eric: And being on the Hill, obviously you see the different camps developing. Presumably there may be politicians who just say, child pornography is bad and the EARN IT Act tries to address it.

And I don't want to be seen as not supporting something like that. Or are there [00:25:00] still strong camps? Or is there more maturity in how people are addressing it? In other words, understanding that even if there is a social good, there is a larger fabric for encouraging technology and facilitating, and not restricting, free speech.

I guess the question I'm asking is, there's always a pendulum, and the pendulum swings in different directions. Sometimes it strains at a certain point and then you move on to the next iteration. Where are we in that pendulum?

Paul: So, I think that the pendulum has shifted a bit, and this is reflected in the law enforcement debate about encryption.

And I think you can see it in the EARN IT debate. One of the big changes is that there are more technological solutions today. There are technologies and applications out there, and I think the advent [00:26:00] of AI and machine learning has been critical to this, that have improved the capabilities of internet entities to actually detect this kind of harmful content.

Despite all of the discussion that we have around disinformation and misinformation, which we can go into for sure, content moderation practices have evolved and are much more mature than they were, because we have technologies today that are much better at trying to detect harmful speech. And that includes child sexual abuse material, which is already illegal to promote and to host. So I think what you're seeing is a response that goes across the aisle and across the non-governmental world, [00:27:00] to include civil society organizations on the left and the right, and companies big and small, that want to look at ways that we can achieve the same ends without these sorts of regulations that would impose potential liability.

Now, I think the pendulum is adjusting right now, because I don't know how much awareness there is on the Hill of how sophisticated some of the technology is. Yet there has been an effort over the past two to three years to really improve the understanding of technology among members of Congress: bringing in new staff with expertise in these areas, and really listening to the outside world, civil society, and the private sector [00:28:00] about what can be done in this area.

So, I think we're going to continue to see further debate here. And it goes directly into a number of other debates that are really ripe, and sometimes heated, right now, beyond CSAM, around content generally: what type of content should be permitted, what sort of authority internet providers should have to regulate the content that appears on their media, versus what sort of restrictions the government wants to impose.

Eric: And I guess on that point, let's talk a little bit about privacy by design: laws and regulations, and building encryption into that, where it could be mandated or strongly encouraged, and where it may be [00:29:00] strongly discouraged.

Paul: Yeah, it's a great question.

So, the privacy debate is multifaceted. At the United States federal level, there has been an effort to create a federal privacy law for many years. This year, over the past six months, we've seen a renewed effort, which is probably the most robust, feasible effort to create a federal privacy law that we've seen in quite some time.

There is a pending bill, the American Data Privacy and Protection Act; ADPPA is the acronym. It has received bipartisan support so far, which is really critical, because privacy often raises concerns that split the Democrats and the Republicans. But [00:30:00] thus far it has received bipartisan support.

It came out of committee in the House, the Energy and Commerce Committee, with a vote of 56 to two. It is currently pending. There is a chance that it will be taken up after the midterms. If not, it will probably be the template for a privacy bill that is discussed in the next Congress, which convenes in January.

And the ADPPA contains provisions that would require data holders to minimize data, so a duty of data minimization. It also contains provisions requiring privacy by design. Now, these are concepts that are not completely fleshed out in the bill, but they do open the door to incorporating more advanced technological solutions to try to minimize the amount of data that is available, taken, collected, and used, in order to protect privacy and also to build [00:31:00] privacy into various systems that end up collecting user data and transforming user data into other uses and purposes.

So, I think there's a real opportunity here to incorporate some of the advanced technologies we've seen, including in the PET space, in order to try to minimize the amount of data that is collected, and also to be able to make productive use of data that is collected without revealing privacy-protected material.

And I think that's really critical. As an association, one of the things that we try to convey and support is that we have members working on hundreds of different projects that are actually really beneficial for society in ways that we might not [00:32:00] even realize.

That's in the education context. It has to do with the financial system and financial services. It has to do with analyzing data. It has to do with helping the government and law enforcement in other capacities. It has to do with educating the public about different things. And the risk with a privacy bill is that if it is not crafted well, it could restrict those productive uses.

Yet if the drafting is good, and those companies are able to make use of some of these technologies, they can still provide those uses to public and private entities and individuals without increasing the risk of revealing or using information that is very personal, that people do not want to share more broadly.

So, I think there's an opportunity here, and there's an [00:33:00] opportunity really to talk about technological solutions to privacy that have long been contemplated by the law but have not been feasible or truly realistic until the current wave of technological advancement. That part of the conversation hasn't happened so much.

But I think it is a conversation that should be developed, particularly as we're looking at what privacy by design means and how we can design not just through laws and policies and procedures but through technology, and how we can actively minimize the amount of data that is needed and used while still retaining productive and socially beneficial uses of data, without revealing information that we don't want to reveal.

I will say there's a lot of work being done in those [00:34:00] spaces behind the scenes, but in terms of the public debate that hasn't really been in the front. 

Eric: And how does that integrate with FTC regulation of things like commercial surveillance? I guess it's privacy by design versus quote unquote commercial surveillance under FTC regulation. Maybe you could expand a little bit on that, and how the definition or interpretation of commercial surveillance feeds into the privacy by design debate.

Paul: Yeah, no, it's a really good question. The FTC has been fairly active.

The FTC has issued, more than a request for information, an advance notice of proposed rulemaking. They're seeking input from the public relating to commercial surveillance and data security in order to potentially advance a formal rulemaking and [00:35:00] ultimately create rules around privacy and data security.

This would be in the absence of a federal law that governs privacy and data security in a holistic manner. Now, that process may be put on hold if legislation in Congress does move forward, legislation that would give the FTC specific authority to develop rulemaking in certain spaces. There are lots of questions about whether the FTC has the requisite authority to do what it wants to do here.

That aside, the advance notice of proposed rulemaking, the ANPR, that the FTC issued is quite expansive. It includes 95 questions to the public. Some of them relate specifically to privacy, some relate to other things. And it is all within this rubric of commercial surveillance.

Commercial surveillance is a term that they [00:36:00] define fairly broadly, to mean essentially any gathering and use of information by companies. So, it applies largely in the digital space; that's where you see information collected at scale. And I do take exception to the use of that term, commercial surveillance, because I think surveillance has a particular use in the intelligence context and in the law enforcement context, which is different from how information is gathered and used,

for the most part, by the private sector. So, I think it has a connotation that is negative, when there are really a lot of ways in which information is gathered and used in socially beneficial ways, and used in ways that benefit individuals [00:37:00] and consumers. 

Eric: So actually, double clicking on that: the term commercial surveillance, when did it first emerge in the FTC's jargon? I don't expect you to pin it down in time.

Paul: That's a great question. So, the use of the term commercial surveillance has emerged in common parlance fairly recently, definitely within the past decade. Within the FTC, it's really been under the current leadership of Chair Khan that the term has been used.

And I think it's part of an effort to expand the FTC's aperture and to fill a gap that exists. There is a gap right now in that we don't have a federal privacy law, and everybody wants a federal [00:38:00] privacy law; it would serve so many benefits. But undertaking rulemaking around privacy and data collection practices using that terminology is very different from undertaking rulemaking around commercial surveillance.

Because it implies something negative, and I just don't think you can paint it with that broad a brush. I think there could be value in the FTC explicating some of its current authorities as they may apply in the data collection and use space. And we've seen actions recently where they've gone after bad actors who have misused data and have used data beyond the permissions that have been granted by individuals.

And that's something the FTC currently has authority to do. It [00:39:00] may be worthwhile to clarify what those authorities are, but I think there is a little bit of concern when you're framing it around surveillance. 

Eric: Recently, members of Congress have expressed their displeasure to heads of large companies for trying to enforce social values, or for making corporate decisions based on non-economic considerations, almost a cultural perspective. And there's been a warning. Again, I'd have to search back, but do you see that as in any way touching upon these issues? Or do you see them as just being so distinct that that's not where an overlap would ever be?

Paul: So that's very interesting. That goes into the current debates around content moderation. Content moderation is a fascinating issue because it [00:40:00] creates unusual coalitions of individuals who ordinarily wouldn't be aligned politically on issues. So, we're definitely seeing an outcry, I think, from certain politicians on the Republican or conservative side about social media platforms censoring the speech of conservative outlets and conservative individuals.

And we also see some criticism from the liberals and the progressives, but it's not the same concern; it has a different target. On the left side, it often has to do with restricting the power that some of these larger companies may have.

Eric: But I actually think it goes a step further, in that it goes to the company's own [00:41:00] internal culture in certain cases. Again, there's concern about certain companies pushing a narrative internally and forcing that, versus others as well. And I don't mean to say that's the issue I'm asking about; I'm just saying that more broadly it seems like it's permeated beyond just social media content moderation.

Paul: Yeah, no, I think you're right about that. This goes to some of the free speech issues that we talked about earlier and the ability of companies to moderate. I speak to content moderation, but I understand that it goes beyond that, based on their own internal policies and what they think is appropriate or not.

And there is an ability for them to do that, just like there's an ability for a newspaper to publish whatever it wants. On the content moderation side, we're [00:42:00] seeing a number of efforts. There's a bill that just came out of the Senate Judiciary Committee yesterday that, depending on who you ask, has different purposes, but essentially requires certain large internet platforms to negotiate with a group of publishers to provide them with better pricing and placement of their items on the different platforms that they host.

Those on the conservative side highlight this as a way to attack what they perceive as censorship by these large social media companies, and those on the left say this is a way to rein in the abuses and the power of these large platforms.

Yet the bill would also essentially require these platforms, these internet platforms that we all use every day, to carry content [00:43:00] that violates their own terms, which may include hateful and abusive speech that may be unsafe and harmful to their users. And so it really comes right up against the constitutional protection of free speech and the ability of these platforms to moderate the content on their platforms.

Eric: And so, tying that back to commercial surveillance: is the underlying definition of commercial surveillance purely one that's driven by a pecuniary interest? Meaning you're data mining for the purposes of anticipating need or for more aggressive marketing tactics.

Or is the definition of commercial surveillance encompassing enough to even capture things like social content moderation?

Paul: The definition of commercial surveillance, as it's laid out right now in the document the FTC issued, is very broad. It covers all of that.

Eric: So, it's not truly commercial [00:44:00] then; it's in commerce, there's surveillance in commerce, but not really surveillance, if I recall correctly.

Paul: They do use the word consumer in the definition, but a consumer can be a consumer of information as well as a consumer who purchases something. It definitely covers practices where an individual is not buying anything, because there is a swath of the internet where there are free services, and those services are paid for by targeted advertising.

We could be in a situation, if these sorts of measures move forward, where individuals need to shell out money to get the content that they have enjoyed for so many years for free, which I think is a situation that many individuals would not be happy about. I personally would not like to have to pay for all of [00:45:00] the content I consume every day.

That would put a big dent in my own personal budget. And I'm sure that many people have grown accustomed to enjoying free content, even if it means seeing an ad on the side of your webpage. 

Eric: And the 95 questions associated with commercial surveillance: a lot of times you can read these questions and there's a lead; they're not always completely unbiased.

You can actually tell the direction that they're going. Where do you think the FTC is looking to take commercial surveillance, or suggesting it be taken?

Paul: That's a really good question, because I don't wanna get too far ahead of myself, but typically when you see a document like this, an advance notice of proposed rulemaking, it is designed around the contours of a rule. But an [00:46:00] ANPR with 95 questions doesn't provide a lot of clarity about what direction they're looking to take. My sense is that there's an interest in creating a regulation that basically governs data privacy in the United States. It would be crafted in such a way that the FTC believes it complies with what authority they have, and they're looking to identify which areas they really need to focus on.

I don't know if any such draft rule is being prepared; I would expect, having worked in government, that people are working on that right now. I don't think it will touch on every question. I imagine that they will consider carefully the comments that they receive and see where people are expressing concern about practices that happen, and where people are expressing concern about the direction that the FTC is taking it in.

And there will probably be some sort of an assessment to [00:47:00] focus it much more than it is right now. I would imagine that the main focus would be on data practices that the FTC considers to be unfair, which is a theme that we've seen come out of the FTC even in the past two weeks: Chair Khan gave a speech a couple of weeks ago, and recently Commissioner Bedoya gave a speech, where they're talking about focusing their efforts on fairness and what it means to be fair. And so, I think that along the lines of fairness and privacy, they will develop a proposal around what is fair commercial surveillance and where companies need to take more care.

But I think we're a ways off from that. There's going to be a lot of activity over the next four [00:48:00] weeks, I think, in terms of organizations and individuals developing input to share with the FTC. The comment deadline is October 21st. I would expect there will be some listening sessions after that, and maybe a period when they wait to see if Congress is actually going to move forward with a privacy bill that would direct the FTC to make rules in particular areas.

And in the absence of that, this is something that will probably be teed up again early next year.

Eric: Great. So, I wanted to shift gears to something that is politically charged, but try to talk about it in terms that are not. Specifically, Dobbs. And not so much from a perspective of where anybody sits on Dobbs, but more specifically: there's been a concern increasingly raised that, with Dobbs and state-level efforts [00:49:00] with regards to abortion, and potentially criminalizing or penalizing it or what have you, the information of people who travel across state lines or otherwise go to reproductive centers or clinics could expose them, could violate their privacy rights.

So there's been a lot of concern about that, particularly, I would say, on the Democratic or more left side of the aisle. But to take a big step back from that: frame it for me in a way that's more bipartisan.

Like how a practitioner in this space addresses these issues. I imagine the last thing you really want to do is start lobbying across the Hill for encryption or privacy and talking about the benefits of it, [00:50:00] and then come into somebody's office and just lead off with something like that.

Particularly if you're talking to somebody who's centrist or on the Republican side. How does this get framed beyond being a purely partisan issue? Or does this actually create more problems in terms of trying to get some sort of consensus on federal regulation and privacy?

Paul: Yeah, that's a great question. With all of these issues, this is an approach that I take, and an approach that my association takes as well: we want to be bipartisan wherever possible. We don't want to take the side of the Democrats or the Republicans, the liberals or the conservatives, even if some issues tend more naturally to one side or the other.

I think that, [00:51:00] and this is getting a little ahead of myself because we haven't engaged on this particular topic, the way to frame this in a bipartisan manner is really around privacy. And you can conceive of situations where those aligned with the conservative viewpoint may be concerned about their information being available, or able to be collected and sold and disseminated, in other contexts.

And I think that the importance of maintaining user privacy, while also allowing companies to make use of data that is available to them publicly or that they have lawfully obtained from other companies, can lead to some sort of a compromise. What you're seeing after Dobbs is that [00:52:00] certain states are taking action to try to protect information

that could lead to potential harms to individuals. And you're also seeing some of the companies that have access to the information that is relevant here taking their own internal steps. Most likely that is what we're going to continue to see. We're probably not going to see Congress, in the short term at least, enact any legislation that would address the situation that we're seeing.

But I think if you do want to generate a compromise, privacy is the first place to go. Privacy is an issue that conceptually has a lot of support among Democrats and Republicans, and you've seen that in the current debate around the ADPPA, which has generated support from prominent Republicans and prominent Democrats.

Democrats and Republicans want a federal privacy bill that protects individuals and individuals' data. [00:53:00] So that's the way I would approach it. If you're speaking with a politician who is not opposed to the ability of private groups to get access to information and do whatever they may do with that information, it is about framing it firstly around the concept and then developing the concrete scenarios where this may lead to bad results.

The status quo may lead to bad results for the sorts of issues and constituents that individual represents. 

Eric: I'm gonna go a little off on this one; I like to explore. So, let's frame it a little differently. Let's take a conservative cause, like guns, the right to own guns.

You could arguably frame it, there's more of a potential health issue there, but I think you could potentially have framers on each side of the debate [00:54:00] saying, hey, if I own a gun, I don't want the state to know, because it could be opposed to my gun ownership, and they could come after me just on the pure fact that I own a gun.

And so, oddly, you find potentially polarizing perspectives on it. Hey, if I'm a die-hard and I think we should criminalize abortion, maybe I'm somebody, and I'm not referring to myself, who thinks that you should be able to get that information, that privacy rights shouldn't be protected.

Or on the flip side, if I'm opposed to guns, I think for that particular reason it's distinct from Dobbs, right? It's not the exact same. Why? Because this is about women's rights. And so it is very easy, you can take those polar perspectives.

But I would also be interested in your thoughts: does Dobbs bring [00:55:00] people into the camp in the same way that maybe gun rights might bring people into the camp? I think you touched on it before, which is, if you want to be more restrictive of guns, you may not choose to go down the road of privacy because it touches too many other issues.

You may choose instead to say licensing or other controls. Or if you're opposed to abortion, you may say, I don't want to go down the road of privacy rights, but I want to do other things that restrict it or penalize it, what have you. What are your thoughts?

Paul: That's probably the best example that we have out there right now. A polarizing issue that could resonate with politicians who are strongly pro-life and [00:56:00] also supportive of some of the efforts that we're seeing right now among civil society groups and NGOs and individuals is really the gun issue.

I think there are distinguishing factors between the gun issue and abortion. And there aren't that many out there who are calling for criminalization of guns or making guns illegal, but there are plenty who call for a ban on certain types of weapons.

And so, you could see a group mobilizing from that perspective and targeting individuals who have purchased military-style assault rifles. In that case, I think that kind of an example would [00:57:00] resonate, and the privacy interests are very similar there.

And people would want to, so long as they're doing something that is permitted, not be targeted for their own personal choices. 

Eric: Great. So, any other topics that we haven't touched on that you think are particularly hot that we should talk about before we wrap up?

Paul: I think that this whole area is completely fascinating, and it's just getting so much attention right now. I am very interested in the interplay of technological solutions with the social goals that we have around privacy and data protection, and in really elevating that conversation. And to go back to something we started with, I think that the encryption debate has changed in the past seven years or [00:58:00] so. I think there is much more support for encryption, and an interest among public authorities in looking at creative ways to address the concerns that they had previously raised pretty strongly with respect to encryption practices.

So, I think as the area continues to evolve, we're going to see much more attention in the space, and I look forward to following it and engaging on it. 

Eric: Yeah, I think one thing that comes out is that probably the best opportunity for ensuring this doesn't get politicized is to continue to try to encompass both sides of the aisle. Otherwise, if you say, hey, it's right here and it's wrong there, all of a sudden you're going to have a very difficult time bringing it back to some core, central privacy themes.

Yeah. 

Paul: Yeah, that's [00:59:00] a good point. We need to be bipartisan in what we're trying to pursue. And politics is quite partisan right now; everyone knows that, I don't need to say it. But I have this idealistic belief that we can find bipartisan solutions to a lot of challenges, and they may not be perfect.

But I think that we need to be proactive in this space. As a nation, we are behind where we need to be. We are falling behind the European Union in terms of monitoring and regulating the digital world. We are letting other nations take the lead. There is that foreign policy or national security aspect to it.

We have a strong interest in being a model for other nations. We generate more [01:00:00] technological advances than any other nation; the innovation climate here is incredibly strong. Yet we're very behind when it comes to setting out societal expectations and guidelines. And I think there's a real opportunity to do more there.

And it's bipartisan. It is something that Democrats and Republicans can both latch onto. 

Eric: Excellent. Thanks so much for coming on the podcast. If people wanna learn more about what you do, where can they find you? 

Paul: Great. It's been a pleasure to be here. Thank you, Eric, for inviting me.

You can reach me by email: plekas@siia.net.

Eric: Excellent. Do you do social media stuff as well, or is SIIA particularly active?

Paul: I'm on LinkedIn. I am passively on Twitter; I have an official Twitter account that I use, which is SIIA Policy. So, if you wanna follow us, that's the place to go. [01:01:00] 

Eric: Excellent. Thanks. Thanks again. It was great to have you on. 

Paul: Thank you, Eric.