
Alex Stamos rose to fame as the former chief security officer for Yahoo and then Facebook. But today he’s the director of Stanford’s Internet Observatory, where he’s immersed in teaching and researching safe tech — and understands better than most the threats that the U.S. is facing, particularly as we sail toward the next U.S. presidential election.

Last night, at a StrictlyVC event in San Francisco, he talked with New York Times cybersecurity correspondent Sheera Frenkel about a handful of these massively impactful issues, first by revisiting what happened during the 2016 presidential election, then catching the audience up on whether the country’s defenses have evolved since. (The short version: they haven’t. If there’s any good news at all, it’s that the federal and state governments are at least aware now that there’s an issue, whereas they appeared largely blindsided by it the last time around.)

What worries Stamos most are “direct attacks on our election infrastructure,” because so little has been done to bolster it. In fact, a big theme of the interview was the growing inability of the public sector to protect Americans or U.S. democracy against actors who would do the country harm.

As it relates to election infrastructure specifically, Stamos used a hyperlocal example to underscore what the U.S. is dealing with right now. As he told Frenkel, “I live in San Mateo County. I’ve met the CIO of San Mateo County. Really nice guy. I’m sure he has a staff of very hard-working people. The idea that the CIO of San Mateo County has to stand up and protect himself against the [Russian military intelligence agency known as the] GRU or China’s Ministry of State Security or Iran’s Islamic Revolutionary Guard Corps or the Lazarus Group of North Korea . . . that’s frickin’ ridiculous. Like, we don’t ask the San Mateo County Sheriff’s department to get ready to repel an invasion by the People’s Liberation Army, but we ask for the cyber equivalent in the United States.”

Put into perspective, San Mateo County is one of about 10,000 local governments in the United States that are involved in elections, said Stamos. “Nobody else in the world runs their elections this way.”

In fact, in nearly every conceivable way, “responsibilities that were once clearly public sector responsibilities are now private sector responsibilities,” he told Frenkel during a later part of their discussion. He would know, having seen it first-hand.

“When I was the chief security officer at Facebook,” he told the audience, “I had a child safety team. We probably put more bad guys away than almost any law enforcement agency outside of the FBI or [Homeland Security Investigations unit] in the child safety realm. Like, there’s no local police department in the United States that put away more child predators than the Facebook child safety team. That is a crazy stat.”

Facebook also has a counterterrorism team — which not everyone realizes — and which has become in many ways the country’s first responder, he suggested. Indeed, Stamos said that “there are several terrorist attacks that you’ve never heard of because they didn’t happen because we caught them. Now, there’s some local law enforcement agency [that] took credit for it, but it was actually our team that found it and turned it over to them with a bow on it.”

Americans might shrug off this continuing shift in who is tackling what, but they do it at their peril, suggested Stamos — who managed to keep the crowd laughing, even as he painted a bleak picture. As he noted, the big tech “companies are exercising this power without any kind of democratic oversight.” Consider, he said, that “[Facebook’s] authorization is the terms of service that people click through and never read when they join Facebook or Instagram. That’s a bizarre set of rules to be bound by when you have such incredible power.”

Another huge blind spot, said Stamos, is the apparent inability — as well as the collective lack of determination required — of the public sector and the increasingly powerful private sector to coordinate their work. Here, he offered another broad example to make it accessible. “Say you had an organized group in the United States that’s running a bunch of Facebook ads, but their money is coming from bitcoin from St. Petersburg,” said Stamos. “That is completely invisible to Facebook. That is perhaps visible to FBI . . . but they don’t have access to that actual content [on FB]. And figuring out a way for these two groups to work with each other without massively violating the privacy of everybody on the platform turns out to be super hard.”

Yet it’s worse than even that sounds, he continued. The reason: there’s no decision tree, in part because the issue has grown so unmanageable that no one wants to own what goes awry. “There’s effectively nobody in charge of this right now, which is one of the scariest things we’re facing as a country. Almost nobody is in charge of defense of cyber, and certainly nobody is in charge of the big picture, [meaning] how do we defend against election [interference] both from a cybersecurity perspective and a disinformation perspective.”

Stamos even jokingly referred to “pockets of people in the U.S. government who are effectively hiding from the White House and trying very, very hard” to escape its attention, given the daunting job they’d be tasked with figuring out. Except, all kidding aside, with no one at the helm and “no real cross-agency process, there’s really nobody in charge,” said Stamos.

That means the “tech companies are effectively the coordinating body for this. And that’s actually really screwed up.”


TechCrunch

Oof — a week after PayPal announced plans to part ways with Facebook’s Libra cryptocurrency project and the related association of the same name, three more names are reportedly breaking away: eBay, Stripe, and Mastercard. (Update: and now Visa!)

In a comment to TechCrunch, a Stripe spokesperson leaves the door open for them to potentially work with Libra in the future — but not right now:

“Stripe is supportive of projects that aim to make online commerce more accessible for people around the world. Libra has this potential. We will follow its progress closely and remain open to working with the Libra Association at a later stage.”

Word of eBay’s exit comes via Reuters, which quotes an eBay spokesperson as saying:

“We highly respect the vision of the Libra Association; however, eBay has made the decision to not move forward as a founding member”

Mastercard’s looming departure, meanwhile, just broke in the WSJ.

This is a fairly massive hit for the project, with three flagship partners all bailing simultaneously. It all happens just days after reports that regulatory pressure behind the scenes was causing a number of members to reconsider their support.

Update: Visa has now backed out as well, citing regulatory concerns directly:

Visa has decided not to join the Libra Association at this time. We will continue to evaluate and our ultimate decision will be determined by a number of factors, including the Association’s ability to fully satisfy all requisite regulatory expectations.

Story developing...


TechCrunch

Facebook CEO Mark Zuckerberg, a 35-year-old billionaire who keeps refusing to sit in front of international parliamentarians to answer questions about his ad business’ impact on democracy and human rights around the world, has a new piece of accountability theatre to sell you: an “Oversight Board”.

Not of Facebook’s business itself. Though you’d be forgiven for thinking that’s what Facebook’s blog post is trumpeting, with the grand claim that it’s “Establishing Structure and Governance for an Independent Oversight Board”.

Referred to during the seeding stage last year, when Zuckerberg gave select face-time to podcast and TV hosts he felt comfortable would spread his conceptual gospel with a straight face, as a sort of ‘Supreme Court of Facebook’, this supplementary content decision-making body has since been outfitted in the company’s customary (for difficult topics) bloodless ‘Facebookese’ (see also “inauthentic behavior”, its choice euphemism for fake activity on its platform).

The Oversight Board is intended to sit atop the daily grind of Facebook content moderation, which takes place behind closed doors and signed NDAs, where outsourced armies of contractors are paid to eyeball the running sewer of hate, abuse and violence so actual users don’t have to, as a more visible mechanism for resolving and thus (Facebook hopes) quelling speech-related disputes.

Facebook’s one-size-fits-all content moderation policy doesn’t do that, and can’t. There’s no such thing as a 2.2BN+ “community” — as the company prefers to refer to its globe-spanning user-base. So quite how the massive diversity of Facebook users can be meaningfully represented by the views of a last-resort case review body with as few as 11 members has not yet been made clear.

“When it is fully staffed, the board is likely to be forty members. The board will increase or decrease in size as appropriate,” Facebook writes vaguely this week.

Even if it were proposing one board member per market of operation (and it’s not), that would require a single individual to meaningfully represent the diverse views of an entire country. Which would be ludicrous, as well as risking the usual political divides stymying good-faith efforts.

It seems most likely Facebook will seek to ensure the initial make-up of the board reflects its corporate ideology — as a US company committed to upholding freedom of expression. (It’s clearly no accident the first three words in the Oversight Board’s charter are: “Freedom of expression”.)

Anything less US-focused might risk the charter’s other clearly stated introductory position — that “free expression is paramount”.

But where will that leave international markets which have suffered the worst kinds of individual and societal harms as a consequence of Facebook’s failure to moderate hate speech, dangerous disinformation and political violence, to name a few of the myriad content scandals that dog the company wherever it goes?

Facebook needs international markets for its business to turn a profit. But you sure wouldn’t know it from its distribution of resources. Not for nothing has the company been accused of digital colonialism.

The level of harm flowing from Facebook decisions to take down or leave up certain pieces of content can be excruciatingly high. Such as in Myanmar where its platform became a conduit for hate speech-fuelled ethnic violence towards the Rohingya people and other ethnic minorities.

It’s reputation-denting failures like Myanmar — which last year led the UN to dub Facebook’s platform “a beast” — that are motivating this latest self-regulation effort. Having made its customary claim that it will do a better job of decision-making in future, Facebook is now making a show of enlisting outsiders for help.

The wider problem is Facebook has scaled so big its business is faced with a steady pipeline of tricky, controversial and at times life-threatening content moderation decisions. Decisions it claims it’s not comfortable making as a private company. Though Facebook hasn’t expressed discomfort at monetizing all this stuff. (Even though its platform has literally been used to target ads at nazis.)

Facebook’s size is humanity’s problem but of course Facebook isn’t putting it like that. Instead — coming sometime in 2020 — the company will augment its moderation processes with a lottery-level chance of a final appeal via a case referral to the Oversight Board.

The level of additional oversight here will of course be exceptionally select. This is a last resort, cherry-picked appeal layer that will only touch a fantastically tiny proportion of the content choices Facebook moderators make every second of every day — and from which real world impacts ripple out and rain down. 

“We expect the board will only hear a small number of cases at first, but over time we hope it will expand its scope and potentially include more companies across the industry as well,” Zuckerberg writes this week, managing output expectations still many months ahead of the slated kick off — before shifting focus onto the ‘future hopes’ he’s always much more comfortable talking about. 

Case selection will be guided by Facebook’s business interests, meaning the push, even here, is still for scale of impact. Facebook says cases will be selected from a pool of complaints and referrals that “have the greatest potential to guide future decisions and policies”.

The company is also giving itself the power to leapfrog general submissions by sending expedited cases directly to the board to ask for a speedy opinion. So its content questions will be prioritized. 

Incredibly, Facebook is also trying to sell this self-styled “oversight” layer as independent from Facebook.

The Oversight Board’s overtly bureaucratic branding is pepped up in Facebook headline spin as “an Independent Oversight Board”. Although the adjective is curiously absent from other headings in Facebook’s already sprawling literature about the OB. Including the newly released charter, which specifies the board’s authority, scope and procedures, and was published this week.

The nine-page document was accompanied by a letter from Zuckerberg in which he opines on “Facebook’s commitment to the Oversight Board”, as his header puts it — also dropping the word ‘independent’ in favor of slipping into a comfortable familiar case. Funny that.

The body text of Zuckerberg’s letter goes on to make several references to the board as “independent”; an “independent organization”; exercising “its independent judgement”. But here that’s essentially just Mark’s opinion.

The elephant in the room — which, if we continue the metaphor, is in the process of being dressed by Facebook in a fancy costume that attempts to make it look like, well, a board room table — is the supreme leader’s ongoing failure to submit himself and his decisions to any meaningful oversight.

Supreme leader is an accurate descriptor for Zuckerberg as Facebook CEO, given the share structure and voting rights he has afforded himself mean no one other than Zuckerberg can sack Zuckerberg. (Asked last year, during a podcast interview with recode’s Kara Swisher if he was going to fire himself, in light of myriad speech scandals on his platform, Zuckerberg laughed and then declined.)

It’s a corporate governance dictatorship that has allowed Facebook’s boy king to wield vast power around the world without any internal checks. Power without moral responsibility if you will.

Throughout Zuckerberg’s (now) 15-year apology tour turn as Facebook CEO neither the claims he’ll do things differently next time nor the cool expansionist ambition have wavered. He’s still at it of course; with a plan for a global digital currency (Libra), while bullishly colonizing literal hook-ups (Facebook Dating). Anything to keep the data and ad dollars flowing.

Recently Facebook also paid a $5BN FTC fine to avoid its senior executives having to face questions about their data governance and policy enforcement fuck-ups — leaving Zuckerberg & co free to get back to lucrative privacy-screwing business as usual. (To put the fine in context, Facebook’s 2018 full-year revenue clocked in at $55.8BN.)

All of which is to say that an ‘independent’ Facebook-devised “Oversight Board” is just a high gloss sticking plaster to cover the lack of actual regulation — internal and external — of Zuckerberg’s empire.

It is also an attempt by Facebook to paper over its continued evasion of democratic accountability. To distract from the fact its ad platform is playing fast and loose with people’s rights and lives; reshaping democracies and communities while Facebook’s founder refuses to answer parliamentarians’ questions or account for scandal-hit business decisions. Privacy is never dead for Mark Zuckerberg.

Evasion is actually a little too tame a term. How Facebook operates is far more actively hostile than that. Its platform is reshaping us without accountability or oversight, even as it ploughs profits into spinning and shape-shifting its business in a bid to prevent our democratically elected representatives from being able to reshape it.

Zuckerberg appropriating the language of civic oversight and jurisprudence for this “project”, as his letter calls the Oversight Board — committing to abide by the terms of a content decision-making review vehicle entirely of his own devising, whose Facebook-written charter stipulates it will “review and decide on content in accordance with Facebook’s content policies and values” — is hardly news. Even though Facebook is spinning at the very highest level to try to make it so.

What would constitute a newsworthy shock is Facebook’s CEO agreeing to take questions from the democratically elected representatives of the billions of users of his products who live outside the US.

Zuckerberg agreeing to meet with parliamentarians around the world so they can put to him questions and concerns on a rolling and regular basis would be a truly incredible news flash.

Instead it’s fiction. That’s not how the empire functions.

The Facebook CEO has instead ducked as much democratic scrutiny as a billionaire in charge of a historically unprecedented disinformation machine possibly can — submitting himself to an awkward question-dodging turn in Congress last year; and one fixed-format meeting of the EU parliament’s conference of presidents, initially set to take place behind closed doors (until MEPs protested), where he was heckled for failing to answer questions.

He has also, most recently, pressed US president Donald Trump’s flesh. We can only speculate on how that meeting of minds went. Power meet irresponsibility — or was it vice versa?

 

International parliamentarians trying on behalf of the vast majority of the world’s Facebook users to scrutinize Zuckerberg and hold his advertising business to democratic account have, meanwhile, been roundly snubbed.

Just this month Zuckerberg declined a third invitation to speak in front of the International Grand Committee on Disinformation which will convene in Dublin this November.

At a second meeting in Canada earlier this year Zuckerberg and COO Sheryl Sandberg both refused to appear — leading the Canadian parliament’s ethics committee to vote to subpoena the pair.

While, last year, the UK parliament got so frustrated with Facebook’s evasive behavior during a timely enquiry into online disinformation, which saw its questions fobbed off by a parade of Zuckerberg stand-ins armed with spin and misdirection, that a sort of intergovernmental alchemy occurred — and the International Grand Committee on Disinformation was formed in an eye-blink, bringing multiple parliaments together to apply democratic pressure to Facebook. 

The UK Digital, Culture, Media and Sport committee’s frustration at Facebook’s evasive behavior also led it to deploy arcane parliamentary powers to seize a cache of internal Facebook documents from a US lawsuit in a creative attempt to get at the world-view locked inside Zuckerberg’s blue box.

The unvarnished glimpse of Facebook’s business that these papers afforded certainly isn’t pretty… 

US legal discovery appears to be the only reliable external force capable of extracting data from inside the belly of the nation-sized beast. That’s a problem for democracies.

So Facebook instructing an ‘oversight board’ of its own making to do anything other than smooth publicity bumps in the road, and pave the way for more Facebook business as usual, is like asking a Koch brothers funded ‘stink tank’ to be independent of fossil fuel interests. The OB is just Facebook’s latest crisis PR tool. More fool anyone who signs up to ink their name to its democratically void rubberstamp.

Dig into the detail of the charter and cracks in the claimed “independence” soon appear.

Aside from the obvious overriding existential points that the board only exists because Facebook exists, making it a dependent function of Facebook whose purpose is to enable its spawning parental system to continue operating; and that it’s funded and charged with chartered purpose by the very same blue-veined god it’s simultaneously supposed to be overseeing (quite the conflict of interest), the charter states that Facebook itself will choose the initial board members. Who will then choose the rest of the first cohort of members.

“To support the initial formation of the board, Facebook will select a group of cochairs. The co-chairs and Facebook will then jointly select candidates for the remainder of the board seats,” it writes in pale grey Facebookese with a tone set to ‘smooth reassurance’ — when the substance of what’s being said should really make you go ‘wtf, how is that even slightly independent?!’

Because the inaugural (Facebook-approved) member cohort will be responsible for the formative case selections — which means they’ll be laying down the foundational ‘case law’ that the board is also bound, per Facebook’s charter, to follow thereafter.

“For each decision, any prior board decisions will have precedential value and should be viewed as highly persuasive when the facts, applicable policies, or other factors are substantially similar,” runs an instructive section on the “basis of decision-making”.

The problem here hardly needs spelling out. This isn’t Facebook changing, this is more of the same ‘Facebook first’ ethos which has always driven its content moderation decisions — just now with a highly polished ‘overseen’ sheen.

This isn’t accountability either. It’s Facebook trying to protect its business from actual regulation by creating a blame-shifting firewall to shield its transparency-phobic execs from democratic (and moral) scrutiny. And indeed to shield Zuckerberg & his inner circle from future content scandals that might threaten to rock the throne, a la Cambridge Analytica.

(Judging by other events this week that mission may not be going so well… )

Given the lengths this company is going to in order to eschew democratic scrutiny — ducking and diving even as it weaves its own faux oversight structure to manage negative PR on its behalf (yep, more fakes!) — you really have to wonder what Facebook is trying to hide.

A moral vacuum the size of a black hole? Or perhaps it’s just trying to buy time to complete its corporate takeover of the democratic world order…

Because of course the Oversight Board can’t set actual Facebook policy. Don’t be ridiculous! It can merely issue policy recommendations — which Facebook can just choose to ignore.

So even if we imagine the OB running years in the future, when it might theoretically be possible its membership has drifted out of Facebook’s comfortable set-up “support” zone, the charter has baked in another firewall that lets Zuckerberg ignore any policy pressure he doesn’t like. Just, y’know, on the off-chance the board gets too independently minded. Truly, there’s nothing to see here.

Entities structured by corporate interests to role-play ‘neutral’ advice or ensure ‘transparent’ oversight — or indeed to promulgate self-interested propaganda dressed in the garb of intellectual expertise — are almost always a stacked trick.

This is why it’s preferable to live in a democracy. And be governed by democratically accountable institutions that are bound by legally enforceable standards of transparency. Though Facebook hopes you’ll be persuaded to vote for manipulation by corporate interest instead.

So while Facebook’s claim that the Oversight Board will operate “transparently” sure sounds good, it’s also entirely meaningless. These are not legal standards of transparency. Facebook is a business, not a democracy. There are no legal binds here. It’s self-regulation. Ergo, a pantomime.

You can see why Facebook avoided actually calling the OB its ‘Supreme Court’; that would have been trolling a little too close to the bone.

Without legal standards of transparency (or indeed democratic accountability) being applied, there are endless opportunities for Facebook’s self interest to infiltrate the claimed separation between oversight board, oversight trust and the rest of its business; to shape and influence case selections, decisions and policy recommendations; and to seed and steer narrative-shaping discussion around hot button speech issues which could help move the angry chatter along — all under the carefully spun cover of ‘independent external oversight’.

No one should be fooled into thinking a Facebook-shaped and funded entity can meaningfully hold Facebook to account on anything. Especially not when, as in this case, it’s been devised to absorb the flak on irreconcilable speech conflicts so Facebook doesn’t have to.

It’s highly doubtful that even a truly independent board cohort slotted into this Zuckerberg PR vehicle could meaningfully influence Facebook’s policy in a more humanitarian direction. Not while its business model is based on mass-scale attention harvesting and privacy-hostile people profiling. The board’s policy recommendations would have to demand a new business model. (To which we already know Facebook’s response: ‘LOL! No.’)

The Oversight Board is just the latest blame-shifting publicity exercise from a company with a user-base as big as a country that gifts it massive resource to throw at its ‘PR problem’ (as Facebook sees it); i.e. how to seem like a good corporate citizen whilst doing everything possible to evade democratic scrutiny and outrun the leash of government regulation. tl;dr: You can’t fix anything if you don’t believe there’s an underlying problem in the first place.

For an example of how the views of a few hand-picked independent experts can be channeled to further a particular corporate agenda look no further than the panel of outsiders Google assembled in Europe in 2014 in response to the European Court of Justice ‘right to be forgotten’ ruling — an unappealable legal decision that ran counter to its business interests.

Google used what it billed as an “advisory committee” of outsiders mostly as a publicity vehicle, holding a large number of public ‘hearings’ where it got to frame a debate and lobby loudly against the law. In such a context Google’s nakedly self-interested critique of EU privacy rights was lent a learned, regionally seasoned dressing of nuanced academic concern, thanks to the outsiders doing time on its platform.

Google also claimed the panel would steer its decision-making process on how to implement the ruling. And in their final report the committee ended up aligning with Google’s preference to only carry out search de-indexing at the European (rather than .com global) domain level. Their full report did contain some dissent. But Google’s preferred policy position won out. (And, yes, there were good people on that Google-devised panel.)

Facebook’s Oversight Board is another such self-interested tech giant stunt. One where Facebook gets to choose whether or not to outsource a few tricky content decisions while making a big show of seeming outward-looking, even as it works to shift and defuse public and political attention from its ongoing lack of democratic accountability.

What’s perhaps most egregious about this latest Facebook charade is it seems intended to shift attention off of the thousands of people Facebook pays to labor daily at the raw coal face of its content business. An outsourced army of voiceless workers who are tasked with moderating at high speed the very worst stuff that’s uploaded to Facebook — exposing themselves to psychological stress, emotional trauma and worse, per multiple media reports.

Why isn’t Facebook announcing a committee to provide that existing expert workforce with a public voice on where its content lines should lie, as well as the power to issue policy recommendations?

It’s impossible to imagine Facebook actively supporting Oversight Board members being selected from among the pool of content moderation contractors it already pays to stop humanity shutting its business down in sheer horror at what’s bubbling up the pipe.

On member qualifications, the Oversight Board charter states: “Members must have demonstrated experience at deliberating thoughtfully and as an open-minded contributor on a team; be skilled at making and explaining decisions based on a set of policies or standards; and have familiarity with matters relating to digital content and governance, including free expression, civic discourse, safety, privacy and technology.”

There’s surely not a Facebook moderator in the whole wide world who couldn’t already lay claim to that skill-set. So perhaps it’s no wonder the company’s ‘Oversight Board’ isn’t taking applications.


TechCrunch

There’s a double standard when it comes to the sexualities of men versus women, trans and gender non-conforming folks. Unbound and Dame Products, two sex tech startups, have teamed up to bring attention to the issue.

By launching a website, “Approved, Not Approved,” and staging a protest outside Facebook’s NYC headquarters, the two startups hope to bring more awareness to the company’s advertising guidelines, which seem to favor products that cater to cisgender men. The point of the digital campaign is to show how ads for sex toys and products geared toward men are more likely to be approved than those for women, trans or gender non-conforming people.

“For so long, advertisements have been how we continue to reinforce the status quo of what we view as societally desirable and validating,” Dame Products CEO Alexandra Fine told TechCrunch. “Since we’re in a category that’s often denied, we wanted to create an experience that illuminates the disparity.”

On Facebook, for example, promoting the sale or use of adult products or services is prohibited, except for ads that pertain to family planning and contraception. The policy also stipulates that ads for contraceptives cannot focus on sexual pleasure or sexual enhancement, and have to be targeted to people 18 years or older.

“They’re never going to view sexual pleasure as necessary — only functionality as necessary,” Fine said. “And since the functioning only matters for one sex, then we’re just encouraging shitty sex or at least one-sided sex. Healthy sex should be pleasurable sex. That’s really what I think is important.”

Facebook, however, clearly disagrees since it explicitly bans ads relating to sexual pleasure.

“We have had open lines of communication with both companies about our policies and are always taking feedback,” a Facebook spokesperson told TechCrunch. “We are working to further clarify our policies in this space in the near future.”

Unfortunately, there is no telling if and when Facebook and other platforms will change their advertising policies to enable companies like Dame Products and Unbound to reach potential customers through ads.

“I think a lot of us feel like we’ve been silenced by these platforms and they control so much,” Unbound CEO Polly Rodriguez told TechCrunch. “Facebook, Instagram, Pinterest — these are the channels startups live and die by. Not being able to advertise on them is a big deal because, in addition to the policies being biased against genders, it prevents those founders from being able to reach potential customers.”

Unbound CEO Polly Rodriguez. The startup was a TC Disrupt SF Startup Battlefield finalist in 2018.

In addition to missing out on potential customers, an inability to advertise can have a detrimental effect on a business in terms of raising venture funding.

“I think one of the most frustrating things is trying to raise a round and getting pushback around where you’ll spend the money,” Rodriguez said. “It’s just tough because it’s this vicious cycle where we could be growing at the same rate as a Hims or a Roman. It’s definitely in the tens of millions of dollars in terms of foregone profits.”

In addition to the protest, Fine is suing New York City’s Metropolitan Transportation Authority alleging it’s in violation of Dame’s First Amendment rights, the due process clause of the 14th Amendment and the state’s constitutional rights regarding freedom of speech. The lawsuit came in light of the MTA preventing Dame from running its ads on the subway.

Still, despite efforts to squash it, sex tech may finally be getting its moment in the sun. Earlier this month, the sex tech industry had a big win when the organizer of the Consumer Electronics Show finally decided to allow sex tech companies to exhibit and participate in its competition. That came after the Consumer Technology Association, the organizer of CES, royally messed up with sex tech company Lora DiCarlo last year. The CTA revoked an innovation award from the company, which is developing a hands-free device that uses biomimicry and robotics to help women achieve a blended orgasm by simultaneously stimulating the G-spot and the clitoris. In May, CTA re-awarded the company and apologized.

“It’s so rare you see a victory like that and it was because of the press,” Rodriguez said. “It’s unfortunate these companies don’t do the right thing because it’s the right thing to do. They do the right thing when enough people speak out about it.”


TechCrunch
