
The startup community has lost another moral leader today.

Leila Janah, a serial entrepreneur who was the CEO and founder of machine learning training data company Samasource, passed away at the age of 37 due to complications from epithelioid sarcoma, a form of cancer, according to a statement from the company.

She focused her career on social and ethical entrepreneurship with the goal of ending global poverty, founding three distinct organizations spanning the for-profit and non-profit worlds over the course of her career. She was best known for Samasource, which was founded a little more than a decade ago to help machine learning specialists develop better ML models through more complete and ethical training datasets.

Janah and her company were well ahead of their time, as issues related to bias in ML models have become top-of-mind for many product leaders in Silicon Valley today. My TechCrunch colleague Jake Bright had interviewed Janah just a few weeks ago, after Samasource raised more than $15 million in venture capital, according to Crunchbase.

In its statement, the company said:

We are all committed to continuing Leila’s work, and to ensuring her legacy and vision is carried out for years to come. To accomplish this, Wendy Gonzalez, longtime business partner and friend to Leila, will take the helm as interim CEO of Samasource. Previously the organization’s COO, Wendy has spent the past five years working alongside Leila to craft Samasource’s vision and strategy.

In addition to Samasource, Janah founded SF-based Samaschool, a 501(c)(3) nonprofit dedicated to helping low-income workers learn critical freelancing skills by helping them negotiate the changing dynamics in the freelance economy. The organization has built partnerships with groups like Goodwill to empower them to offer additional curricular resources within their own existing programs and initiatives.

Janah also founded LXMI, a skin-care brand that emphasized organic and fair-trade ingredients, with a focus on sourcing from low-income women’s cooperatives in East Africa. Founded three years ago, the company raised a seed round from the likes of NEA, Sherpa, and Reid Hoffman, according to Crunchbase.

Across all of her initiatives, Janah consistently put the concerns of under-represented people at the forefront, and designed organizations to empower such people in their daily lives. Her entrepreneurial spirit, commitment, and integrity will be sorely missed in the startup community.

Our editor Josh Constine had this to say of Janah’s impact. “Leila was propulsive. Being around her, you’d swear there were suddenly more hours in the day just based on how much she could accomplish. Yet rather than conjuring that energy through ruthless efficiency, she carried on with grace and boundless empathy. Whether for her closest friends or a village of strangers on the other side of the world, she embraced others’ challenges as her own. Leila turned vulnerability into an advantage, making people feel so comfortable in her presence that they could unwind their personal and professional puzzles. Leila is the kind of founder we need more of, and she’ll remain an example of how to do business with heart.”


TechCrunch

We are living through one of the nation’s longest periods of economic growth. Unfortunately, the good times can’t last forever. A recession is likely on the horizon, even if we can’t pinpoint exactly when. Founders can’t afford to wait until the midst of a downturn to figure out their game plans; that would be like initiating swim lessons only after getting dumped in the open ocean.

When recession inevitably strikes, it will be many founders’ — and even many VCs’ — first experience navigating a downturn. Every startup executive needs a recession playbook. The time to start building it is now.

While recessions make running any business tough, they don’t spell doom. I co-founded two separate startups just before downturns struck, yet I successfully navigated one through the 2000 dot-com bust and the other through the 2008 financial crisis. Both companies not only survived but thrived: one went public and the other was acquired by Mastercard.

I hope my lessons learned prove helpful in building your own recession game plan.


TechCrunch

We’re down in Sunnyvale, CA today, where Alchemist Accelerator is hosting a demo day for its most recent batch of companies. This is the 23rd class to graduate from Alchemist, with notable alums including LaunchDarkly, MightyHive, Matternet, and Rigetti Computing. As an enterprise accelerator, Alchemist focuses on companies that make their money from other businesses, rather than consumers.

Twenty-one companies presented in all, each getting five minutes to explain their mission to a room full of investors, media, and other founders.

Here are our notes on all 21 companies, in the order in which they presented:

i-50: Uses AI to monitor human actions on production lines, using computer vision to look for errors or abnormalities along the way. Founder Albert Kao says that 68% of manufacturing issues are caused by human error. The company currently has 3 paid pilots, totalling $190k in contracts.

Perimeter: A data visualization platform for firefighters and other first responders, allowing them to more quickly input and share information (such as how a fire is spreading) with each other and the public. Projecting $1.7M in revenue within 18 months.

Einsite: Computer vision-based analytics for mining and construction. Sensors and cameras are mounted on heavy machines (like dump trucks and excavators). Footage is analyzed in the cloud, with the data ultimately presented to job site managers to help monitor progress and identify issues. Founder Anirudh Reddy says the company will have $1.2M in bookings and be up and running on 2,100 machines this year.

Mall IQ: A location-based marketing/analytics SDK for retail stores and malls to tie into their apps. Co-founder Batu Sat says they’ve built an “accurate and scalable” method of determining a customer’s indoor position without GPS or additional hardware like Bluetooth beacons.

Ipsum Analytics: Machine learning system meant to predict the outcome of a company’s ongoing legal cases by analyzing the relevant historical cases of a given jurisdiction, judge, etc. Its first target customers are hedge funds, helping them project how legal outcomes will impact the market.

Vincere Health: Works with insurance companies to pay people to stop smoking. They’ve built an app with companion breathalyzer hardware; each time a user checks in with the breathalyzer to prove they’re smoking less, the user gets paid. They’ve raised $400k so far.

Harmonize: A chat bot system for automating HR tasks, built to work with existing platforms like Slack and Microsoft Teams. An employee could, for example, message the bot to request time off — the request is automatically forwarded to their manager, presenting them with one-click approve/deny buttons which handle everything behind the scenes. The company says it currently has 400 paying customers and is seeing $500k in ARR, projecting $2M ARR in 2020.

Coreshell Technologies: Working on a coating for lithium-ion batteries which the company says makes them 25% cheaper and 50% faster to produce. The company’s co-founder says they have 11 patents filed, with 2 paid agreements signed and 12 more in the pipeline.

in3D: An SDK for 3D body scanning via smartphone, meant to help apps do things like gather body measurements for custom clothing, allow for virtual clothing try-ons, or create accurate digital avatars for games.

Domatic: “Intelligent power” for new building construction. Pushes both data and low-voltage power over a single “Class 2” wire, making it easier/cheaper for builders to make a building “smart”. Co-founder Jim Baldwin helped build FireWire at Apple, and co-founder Gladys Wong was previously a hardware engineer at Cisco.

MeToo Kit: A kit meant to allow victims of sexual assault or rape to gather evidence through an at-home, self-administered process. Co-founder Madison Campbell says that they’ve seen 100k kits ordered by universities, corporations, non-profits, and military organizations. The company garnered significant controversy in September 2019 after multiple states issued cease-and-desist letters, with Michigan’s Attorney General arguing that such a kit would not be admissible in court. Campbell told BuzzFeed last year that she would “never stop fighting” for the concept.

AiChemist Metal: Building a thin, lightweight battery made of copper and cellulose “nanofibers”. Co-founder Sergey Lopatin says the company’s solution is 2-3x lighter, stronger, and cheaper than alternatives, and that the company is projecting profitability in 2021. Focusing first on batteries for robotics, flexible displays, and electric vehicles.

Delightree: A task management system for franchises, meant to help owners create and audit to-dos across locations. Monitors online customer reviews, automatically generating potential tasks accordingly. In pilot tests with 3 brands, with 16 more brands on a waitlist, which the company says translates to about $400k in potential ARR.

DigiFabster: An ML-powered “smart quoting” tool for manufacturing shops doing things like CNC machining to make custom parts and components. Currently working with 125 customers, they’re seeing $500k in ARR.

NachoNacho: Helps small/medium businesses monitor and manage software subscriptions their employees sign up for. Issues virtual credit cards which businesses use to sign up for services; they can place budgets on each card, cancel cards, and quickly determine where their money is going. Launched 9 months ago, NachoNacho says it’s currently working with over 1,600 businesses.

Zapiens: A virtual assistant-style tool for sharing knowledge within a company, tied into tools like Slack/Salesforce/Microsoft 365. Answers employee questions, or uses its understanding of each employee’s expertise to find someone within the company who can answer the question.

Onebrief: A tool aiming to make military planning more efficient. Co-founder/Army officer Grant Demaree says that much of the military’s planning is buried in Word/PowerPoint documents, with inefficiencies leading to ballooning team sizes. By modernizing the planning approach with a focus on visualization, automation and data re-usability, he says planning teams could be smaller yet more agile.

Perceive: Spatial analytics for retail stores. Builds a sensor that hooks into existing in-store lighting wiring to create a 3D map of stores, analyzing customer movement/behavior (without face recognition or WiFi/beacon tracking) to identify weak spots in store layout or staffing.

Acoustic Wells: IoT devices for monitoring and controlling production from oil fields. Analyzes sound from pipes “ten thousand feet underground” to regulate how a machine is running, optimizing production while minimizing waste. Charges a monthly fee per oil well. Currently has letters of intent to roll out its solution in over 1,000 wells.

SocialGlass: A marketplace for government procurement. Lets governments buy goods/services valued under $10,000 without going through a bidding process, with SocialGlass guaranteeing they’ve found the cheapest price. Currently working with 50+ suppliers offering 10,000 SKUs.

Applied Particle Technology: Continuous, real-time worker health/safety tracking for industrial environments. Working on wireless, wearable monitors that stream environmental data to identify potential exposure risks. Focusing first on the mining and metals industries, later moving into construction, firefighting, and utilities environments.


TechCrunch

Government and policy experts are among the most important people in the future of transportation. Any company pursuing the shared scooters and bikes business, ride-hailing, on-demand shuttles and eventually autonomous vehicles has to have someone, or a team of people, who can work with cities.

Enter Shin-pei Tsay, the director of policy, cities and transportation at Uber. TechCrunch is excited to announce that Tsay will join us onstage at TC Sessions: Mobility, a one-day conference dedicated to the future of mobility and transportation.

If there’s one person who is at the center of this universe, it’s Tsay. In her current role at Uber, she leads a team of issues experts focused on what Uber calls a “sustainable multi-modal urban future.”

Tsay is also a founder. Prior to Uber, she founded a social impact analysis company called Make Public. She was also the deputy executive director of TransitCenter, a national foundation focused on improving urban transportation. She also founded and directed the cities and transportation program under the Energy and Climate Program at the Carnegie Endowment for International Peace.

For the past four years, Tsay has served as a commissioner for the City of New York Public Design Commission. She is on the board of the national nonprofit In Our Backyard.

Stay tuned, we’ll have more speaker announcements in the coming weeks. In case you missed it, TechCrunch has already announced Ike co-founder and chief engineer Nancy Sun, Waymo’s head of trucking Boris Sofman and Trucks VC’s Reilly Brennan will be participating in TC Sessions: Mobility.

Don’t forget that $250 early-bird tickets are now on sale — save $100 on tickets before prices go up on April 9; book today.

Students, you can grab your tickets for just $50 here.


TechCrunch

Did you notice a recent change to how Google search results are displayed on the desktop?

I noticed something last week — thinking there must be some kind of weird bug messing up the browser’s page rendering, because suddenly everything looked similar: a homogeneous sea of blue text links and favicons that, on such a large expanse of screen, comes across as one block of background noise.

I found myself clicking on an ad link — rather than the organic search result I was looking for.

Here, for example, are the top two results for a Google search for flight search engine ‘Kayak’ — with just a tiny ‘Ad’ label to distinguish the click that will make Google money from the click that won’t…

Turns out this is Google’s latest dark pattern: The adtech giant has made organic results even more closely resemble the ads it serves against keyword searches, as writer Craig Mod was quick to highlight in a tweet this week.

Last week, in its own breezy tweet, Google sought to spin the shift as quite the opposite — saying the “new look” presents “site domain names and brand icons prominently, along with a bolded ‘Ad’ label for ads”:

But Google’s explainer is almost a dark pattern in itself.

If you read the text quickly, you’d likely come away with the impression that it has made organic search results easier to spot, since it claims components of these results now appear more “prominently”.

Yet, read it again, and Google is essentially admitting that a parallel emphasis is being placed — one which, when you actually look at the thing, has the effect of flattening the visual distinction between organic search results (which consumers are looking for) and ads (which Google monetizes).

Another eagle-eyed Twitter user, going by the name Luca Masters, chipped into the discussion generated by Mod’s tweet — to point out that the tech giant is “finally coming at this from the other direction”.

‘This’ being deceptive changes to ad labelling; and ‘other direction’ being a reference to how now it’s organic search results being visually tweaked to shrink their difference vs ads.

Google previously laid the groundwork for this latest visual trickery by spending earlier years amending the look of ads to bring them closer in line with the steadfast, cleaner appearance of genuine search results.

Except now it’s fiddling with those too. Hence ‘other direction’.

Masters helpfully quote-tweeted a vintage tweet (from 2016) by journalist Ginny Marvin, which presents a visual history of Google ad labelling in search results aptly titled “color fade” — a reference to the gradual demise of the color-shaded box Google used to apply to clearly distinguish ads in search results.

Those days are long gone now, though.

 

Now a user of Google’s search engine has — essentially — only a favicon between them and an unintended ad click. Squint or you’ll click it.

This visual trickery may be fractionally less confusing in a small screen mobile environment — where Google debuted the change last year. But on a desktop screen these favicons are truly minuscule. And where to click to get actual information starts to feel like a total lottery.

A lottery that’s being stacked in Google’s favor because confused users are likely to end up clicking more ad links than they otherwise would, meaning it cashes in at the expense of web users’ time and energy.

Back in May, when Google pushed this change on mobile users, it touted the tweaks as a way for sites to showcase their own branding, instead of looking like every other blue link on a search result page. But it did so while simultaneously erasing a box-out that it had previously displayed around the label ‘Ad’ to make it stand out.

That made it “harder to differentiate ads and search results,” as we wrote then — predicting it would “likely lead to outcry”.

There were certainly complaints. And there will likely be more now — given the visual flattening of the gap between ad clicks and organic links looks even more confusing for users of Google search on desktop. (Albeit, the slow drip of design change updates also works against mass user outcry.)

We reached out to Google to ask for a response to the latest criticism that the new design for search results makes it almost impossible to distinguish between organic results and ads. But the company ignored repeated requests for comment.

Of course it’s true that plenty of UX design changes face backlash, especially early on. Change in the digital realm is rarely instantly popular. It’s usually more ‘slow burn’ acceptance.

But there’s no consumer-friendly logic to this one. (And the slow burn going on here involves the user being cast in the role of the metaphorical frog.)

Instead, Google is just making it harder for web users to click on the page they’re actually looking for — because, from a revenue-generating perspective, it prefers them to click an ad.

It’s the visual equivalent of a supermarket putting a similarly packaged own-brand shampoo right next to a fancy branded one on the shelf — in the hopes a rushed shopper will pluck the wrong one. (Real-life dark patterns are indeed a thing.)

It’s also a handy illustration of quite how far away from the user Google’s priorities have shifted, and continue to drift.

“When Google introduced ads, they were clearly marked with a label and a brightly tinted box,” said UX specialist Harry Brignull. “This was in stark contrast to all the other search engines at the time, who were trying to blend paid listings in amongst the organic ones, in an effort to drive clicks and revenue. In those days, Google came across as the most honest search engine on the planet.”

Brignull is well qualified to comment on dark patterns — having been calling out deceptive design since 2010 when he founded darkpatterns.org.

“I first learned about Google in the late 1990s. In those days you learned about the web by reading print magazines, which is charmingly quaint to look back on. I picked up a copy of Wired Magazine and there it was – a sidebar talking about a new search engine called ‘Google’,” he recalled. “Google was amazing. In an era of portals, flash banners and link directories, it went in the opposite direction. It didn’t care about the daft games the other search engines were playing. It didn’t even seem to acknowledge they existed. It didn’t even seem to want to be a business. It was a feat of engineering, and it felt like a public utility.

“The original Google homepage was recognised as a guiding light of purism in digital design. Search was provided by an unstyled text field and button. There was nothing else on the homepage. Just the logo. Search results were near-instant and they were just a page of links and summaries – perfection with nothing to add or take away. The back-propagation algorithm they introduced had never been used to index the web before, and it instantly left the competition in the dust. It was proof that engineers could disrupt the rules of the web without needing any suit-wearing executives. Strip out all the crap. Do one thing and do it well.”

“As Google’s ambitions changed, the tinted box started to fade. It’s completely gone now,” Brignull added.

The one thing Google very clearly wants to do well now is serve more ads. It’s chosen to do that deceptively, by steadily — and consistently — degrading the user experience. A far cry, then, from “public utility”.

And that user-friendly Google of old? Yep, also completely gone.


TechCrunch

Memphis Meats, a developer of technologies to manufacture meat, seafood and poultry from animal cells, has raised $161 million in financing from investors including SoftBank Group, Norwest and Temasek, the investment fund backed by the government of Singapore.

The investment brings the company’s total financing to $180 million. Previous investors include individuals and institutions like Richard Branson, Bill Gates, Threshold Ventures, Cargill, Tyson Foods, Finistere, Future Ventures, Kimbal Musk, Fifty Years and CPT Capital.

Other companies including Future Meat Technologies, Aleph Farms, Higher Steaks, Mosa Meat and Meatable are pursuing meat grown from cell cultures as a replacement for animal husbandry, whose environmental impact is a large contributor to deforestation and climate change around the world.

Innovations in computational biology, bio-engineering and materials science are creating new opportunities for companies to develop and commercialize technologies that could replace traditional farming with new ways to produce foods that have a much lower carbon footprint and bring about an age of superabundance, according to investors.

The race is on to see who will be the first to market with a product.

“For the entire industry, an investment of this size strengthens confidence that this technology is here today rather than some far-off future endeavor. Once there is a “proof of concept” for cultivated meat — a commercially available product at a reasonable price point — this should accelerate interest and investment in the industry,” said Bruce Friedrich, the executive director of the Good Food Institute, in an email. “This is still an industry that has sprung up almost overnight and it’s important to keep a sense of perspective here. While the idea of cultivated meat has been percolating for close to a century, the very first prototype was only produced six years ago.”


TechCrunch

The UK’s data protection watchdog has today published a set of design standards for Internet services which are intended to help protect the privacy of children online.

The Information Commissioner’s Office (ICO) has been working on the Age Appropriate Design Code since the 2018 update of domestic data protection law — as part of a government push to create ‘world-leading’ standards for children when they’re online.

UK lawmakers have grown increasingly concerned about the ‘datafication’ of children who go online when they may be too young to legally consent to being tracked and profiled under existing European data protection law.

The ICO’s code comprises 15 standards of what it calls “age appropriate design” — which the regulator says reflect a “risk-based approach” — including stipulations that settings should default to ‘high privacy’; that only the minimum amount of data needed to provide the service should be collected and retained; and that children’s data should not be shared unless there’s a reason to do so that’s in their best interests.

Profiling should also be off by default. The code also takes aim at dark pattern UI designs that seek to manipulate user actions against their own interests, saying “nudge techniques” should not be used to “lead or encourage children to provide unnecessary personal data or weaken or turn off their privacy protections”.

“The focus is on providing default settings which ensures that children have the best possible access to online services whilst minimising data collection and use, by default,” the regulator writes in an executive summary.

While the age appropriate design code is focused on protecting children, it applies to a very broad range of online services — with the regulator noting that “the majority of online services that children use are covered” and also stipulating “this code applies if children are likely to use your service” [emphasis ours].

This means it could be applied to anything from games to social media platforms to fitness apps to educational websites and on-demand streaming services — if they’re available to UK users.

“We consider that for a service to be ‘likely’ to be accessed [by children], the possibility of this happening needs to be more probable than not. This recognises the intention of Parliament to cover services that children use in reality, but does not extend the definition to cover all services that children could possibly access,” the ICO adds.

Here are the 15 standards in full as the regulator describes them:

  1. Best interests of the child: The best interests of the child should be a primary consideration when you design and develop online services likely to be accessed by a child.
  2. Data protection impact assessments: Undertake a DPIA to assess and mitigate risks to the rights and freedoms of children who are likely to access your service, which arise from your data processing. Take into account differing ages, capacities and development needs and ensure that your DPIA builds in compliance with this code.
  3. Age appropriate application: Take a risk-based approach to recognising the age of individual users and ensure you effectively apply the standards in this code to child users. Either establish age with a level of certainty that is appropriate to the risks to the rights and freedoms of children that arise from your data processing, or apply the standards in this code to all your users instead.
  4. Transparency: The privacy information you provide to users, and other published terms, policies and community standards, must be concise, prominent and in clear language suited to the age of the child. Provide additional specific ‘bite-sized’ explanations about how you use personal data at the point that use is activated.
  5. Detrimental use of data: Do not use children’s personal data in ways that have been shown to be detrimental to their wellbeing, or that go against industry codes of practice, other regulatory provisions or Government advice.
  6. Policies and community standards: Uphold your own published terms, policies and community standards (including but not limited to privacy policies, age restriction, behaviour rules and content policies).
  7. Default settings: Settings must be ‘high privacy’ by default (unless you can demonstrate a compelling reason for a different default setting, taking account of the best interests of the child).
  8. Data minimisation: Collect and retain only the minimum amount of personal data you need to provide the elements of your service in which a child is actively and knowingly engaged. Give children separate choices over which elements they wish to activate.
  9. Data sharing: Do not disclose children’s data unless you can demonstrate a compelling reason to do so, taking account of the best interests of the child.
  10. Geolocation: Switch geolocation options off by default (unless you can demonstrate a compelling reason for geolocation to be switched on by default, taking account of the best interests of the child). Provide an obvious sign for children when location tracking is active. Options which make a child’s location visible to others must default back to ‘off’ at the end of each session.
  11. Parental controls: If you provide parental controls, give the child age appropriate information about this. If your online service allows a parent or carer to monitor their child’s online activity or track their location, provide an obvious sign to the child when they are being monitored.
  12. Profiling: Switch options which use profiling ‘off’ by default (unless you can demonstrate a compelling reason for profiling to be on by default, taking account of the best interests of the child). Only allow profiling if you have appropriate measures in place to protect the child from any harmful effects (in particular, being fed content that is detrimental to their health or wellbeing).
  13. Nudge techniques: Do not use nudge techniques to lead or encourage children to provide unnecessary personal data or weaken or turn off their privacy protections.
  14. Connected toys and devices: If you provide a connected toy or device ensure you include effective tools to enable conformance to this code.
  15. Online tools: Provide prominent and accessible tools to help children exercise their data protection rights and report concerns.

The Age Appropriate Design Code also defines children as under the age of 18 — which offers a higher bar than current UK data protection law, which, for example, sets 13 as the age at which children can legally consent to being tracked online.

So — assuming (very wildly) that Internet services were suddenly to decide to follow the code to the letter, setting trackers off by default and not nudging users to weaken privacy-protecting defaults by manipulating them into giving up more data — the code could, in theory, raise the level of privacy both children and adults typically get online.

However, it’s not legally binding — so there’s a pretty fat chance of that.

Although the regulator does make a point of noting that the standards in the code are backed by existing data protection laws, which it does regulate and can legally enforce (and which include clear principles like ‘privacy by design and default’) — pointing out that it has powers to take action against law breakers, including “tough sanctions” such as orders to stop processing data and fines of up to 4% of a company’s global turnover.

So, in a way, the regulator appears to be saying: ‘Are you feeling lucky, data punk?’

Last April the UK government published a white paper setting out its proposals for regulating a range of online harms — including seeking to address concern about inappropriate material that’s available on the Internet being accessed by children.

The ICO’s Age Appropriate Design Code is intended to support that effort. So there’s also a chance that some of the same sorts of stipulations could be baked into the planned online harms bill.

“This is not, and will not be, ‘law’. It is just a code of practice,” said Neil Brown, an Internet, telecoms and tech lawyer at Decoded Legal, discussing the likely impact of the suggested standards. “It shows the direction of the ICO’s thinking, and its expectations, and the ICO has to have regard to it when it takes enforcement action but it’s not something with which an organisation needs to comply as such. They need to comply with the law, which is the GDPR [General Data Protection Regulation] and the DPA [Data Protection Act] 2018.

“The code of practice sits under the DPA 2018, so companies which are within the scope of that are likely to want to understand what it says. The DPA 2018 and the UK GDPR (the version of the GDPR which will be in place after Brexit) covers controllers established in the UK, as well as overseas controllers which target services to people in the UK or monitor the behaviour of people in the UK. Merely making a service available to people in the UK should not be sufficient.”

“Overall, this is consistent with the general direction of travel for online services, and the perception that more needs to be done to protect children online,” Brown also told us.

“Right now, online services should be working out how to comply with the GDPR, the ePrivacy rules, and any other applicable laws. The obligation to comply with those laws does not change because of today’s code of practice. Rather, the code of practice shows the ICO’s thinking on what compliance might look like (and, possibly, goldplates some of the requirements of the law too).”

Organizations that choose to take note of the code — and are in a position to be able to demonstrate they’ve followed its standards — stand a better chance of persuading the regulator they’ve complied with relevant privacy laws, per Brown.

“Conversely, if they want to say that they comply with the law but not with the code, that is (legally) possible, but might be more of a struggle in terms of engagement with the ICO,” he added.

Zooming back out, the government said last fall that it’s committed to publishing draft online harms legislation for pre-legislative scrutiny “at pace”.

But at the same time it dropped a controversial plan included in a 2017 piece of digital legislation which would have made age checks for accessing online pornography mandatory — saying it wanted to focus on developing “the most comprehensive approach possible to protecting children”, i.e. via the online harms bill.

How comprehensive the touted ‘child protections’ will end up being remains to be seen.

Brown suggests age verification could come through as a “general requirement”, given the age verification component of the Digital Economy Act 2017 was dropped — and “the government has said that these will be swept up in the broader online harms piece”.

The government has also been consulting with tech companies on possible ways to implement age verification online.

However the difficulties of regulating perpetually iterating Internet services — many of which are also operated by companies based outside the UK — have been writ large for years. (And are now mired in geopolitics.)

Meanwhile, the enforcement of existing European digital privacy laws remains, to put it politely, a work in progress.


TechCrunch
