Image Credits: Rebecca Portnoff / Bryce Durbin

As part of TechCrunch’s ongoing Women in AI series, which seeks to give AI-focused women academics and others their well-deserved — and overdue — time in the public eye, TechCrunch interviewed Dr. Rebecca Portnoff, who is vice president of data science at the nonprofit Thorn, which builds tech to protect children from sexual abuse.

She attended Princeton University before receiving her PhD in computer science from the University of California, Berkeley. She has been working her way up the ranks at Thorn, where she has worked since 2016. She started as a volunteer research scientist and now, eight years later, leads a team that is probably one of the only ones in the world dedicated to building machine learning and artificial intelligence to stop, prevent, and defend children from sexual abuse.

“During my senior year at Princeton, as I was thinking over what to do after graduation, my sister recommended I read ‘Half the Sky’ by Nicholas Kristof and Sheryl WuDunn, which introduced me to the topic of child sexual abuse,” she told TechCrunch, saying the book inspired her to study how to make a difference in this space. She went on to write her doctoral dissertation, focusing especially on using machine learning and AI in this space.

The mission to protect children

At Thorn, Portnoff’s team helps to identify victims, stop revictimization, and prevent the viral spread of sexual abuse material. She led Thorn and All Tech Is Human’s joint Safety by Design initiative last year, which strives to prevent people from using generative AI to sexually harm children.

“It was a tremendous lift, collaboratively defining principles and mitigations to prevent generative models from producing abuse material, make such material more reliably detected, and prevent the distribution of those models, services, and apps that are used to produce this abuse material, then aligning industry leaders to commit to those standards,” she recalled. She said she met many people dedicated to the cause, “but I’ve also got more gray hairs than I did at the start of it all.”

Using AI to create nonconsensual intimate images has become a major topic of discussion, particularly as AI porn generation becomes more advanced, as TechCrunch previously reported. There is currently no comprehensive federal law in place that protects against or prevents sexual generative AI images being created of other people without their consent, though individual states, like Florida, Louisiana, and New Mexico, have passed their own legislation to specifically target AI child abuse.

In fact, she said this is one of the most pressing issues facing AI as it evolves. “One in 10 minors report they know of cases where their peers had generated nude imagery of other kids,” she said.

“We don’t have to live in this reality, and it’s unacceptable that we’ve allowed it to get to this point already.” She said there are mitigations, however, that can be put in place to prevent and reduce this abuse. Thorn, for instance, is advocating that tech companies adopt its safety-by-design principles and mitigations and publicly share how they are preventing the misuse of their generative AI technologies and products to further child sexual abuse. It is also collaborating with professional organizations such as the Institute of Electrical and Electronics Engineers (IEEE) and the National Institute of Standards and Technology (NIST) to help set standards for companies that can be used to audit progress, as well as engaging with policymakers to inform them of how important this is.

“Legislation grounded in impact will be necessary to bring all companies and stakeholders on board,” she said.

Working as a woman in AI

As she rose through the ranks in building AI, Portnoff recalls people ignoring her advice, asking instead to speak with someone who has a technical background. “My answer? ‘No worries, you are talking with someone with a technical background,’” she said.

She said a few things have helped her navigate working in such a male-dominated field: being prepared, acting with confidence, and assuming good intentions. Being prepared helps her enter rooms with more confidence, while confidence allows her to navigate challenges with curiosity and boldness, “seeking first to understand and then to be understood,” she continued.

“Assuming good intent helps me approach challenges with kindness rather than defensiveness,” she said. “If that good intention truly isn’t there, it’ll show eventually.”

Her advice to women seeking to enter AI is to always believe in your ability and significance. She said it’s easy to fall into the trap of letting the assumptions people have about you define your potential, but that everyone’s voice is going to be needed in this current AI revolution.

“As ML/AI becomes more integrated into our human systems, all of us need to work together to ensure it’s done in a way that builds up our collective flourishing and prioritizes the most vulnerable among us.”

Building ethical AI

Portnoff said there are many facets to responsible AI, including the need for transparency, fairness, reliability, and safety. “But all of them have one thing in common,” she continued. “Responsibly building ML/AI requires engaging with more stakeholders than just your fellow technologists.”

This means more active listening and collaboration. “If you’re following a roadmap for building responsible AI, and you find that you haven’t spoken to anyone outside your organization or your engineering team in the process, you’re probably headed in the wrong direction.”

And, as investors continue to dump billions of dollars into AI startups, Portnoff suggested that investors can start looking at responsibility as early as the due diligence stage, examining a company’s commitment to ethics before making an investment and then requiring certain standards to be met. This can “prevent harm and enable positive growth.”

“There is a lot of work that needs to be done,” she said, speaking broadly. “And you may be the one to make it happen.”