
Transforming Work with Sophie Wade


May 17, 2024

Juliette Powell is Founder and Managing Partner of Kleiner Powell International, a consultancy working at the intersection of responsible technology and business. She is co-author of “The AI Dilemma: 7 Principles for Responsible Technology.” Juliette brings rich technology research and innovation experience to evaluate our evolving landscape as we anticipate AI integration. She explains her core concerns: what we need to pay attention to and lean into. She discusses the importance of personal data ownership, creative friction, digital trust, and logic. Juliette explains how diverse contributions diminish divergent, asymmetric trajectories, so we all need to be actively involved.

 

 

TAKEAWAYS

 

[02:30] Monopoly was Juliette’s favorite childhood game, showing how you can change your circumstances.

 

[02:50] Juliette studies finance and international business to understand global interconnectedness.

 

[03:15] At university, Juliette develops a TV career focusing on the business side of media.

 

[04:32] Interviewing Janet Jackson and Nelson Mandela reveals juxtaposed insecurity and confidence.

 

[07:30] Juliette’s first book results from her involvement with TED’s original founder, producing the conference and meeting visionary thinkers.

 

[08:10] Transitioning from TV, Juliette explores technologies and the rise of social media.

 

[10:25] Citizen journalism and political messaging delivered via digital channels fascinate Juliette.

 

[12:10] Juliette tries to lead as her whole self, while seeing others disconnect their work and non-work lives.

 

[13:20] Where engineers can experience misalignment when making decisions in their AI-related work.

 

[14:20] Juliette highlights those who live holistically as fully integrated people in her first book.

 

[15:00] Juliette experienced integrated work/life early on, meeting a couple working remotely in Thailand.

 

[16:50] Early career motivation to find work, thinking about Maslow’s hierarchy of needs.

 

[18:58] How the internet extended possibilities beyond someone’s local geography.

 

[19:50] Ecosystem pressures raise mental health issues, with people trying to survive rather than thrive.

 

[20:50] Navigating uncertainty—personally and professionally—requires having Plan A, B, C, and D.

 

[21:44] Juliette founded the Gathering to ensure diversity and avoid past mistakes in tech development.

 

[24:41] At TED, there is no separation between the expertise on stage and the audience.

 

[26:04] Turing AI and WeTheData.org focus on the personal data ecosystem, ownership, and ethical use.

 

[27:48] Research reveals four grand challenges, including digital trust and digital infrastructure/access.

 

[29:30] An ‘eBay for data’ to aggregate and monetize personal data as Finns do.

 

[31:31] Research on Americans’ and Europeans’ different attitudes to their personal data.

 

[35:26] Most of Juliette’s NYU students are terrified of the potential impact of AI on their skills.

 

[36:25] Students’ potential questions: ‘Will I have meaning? Can I contribute anything?’

 

[37:40] Juliette teaches students research methods to reduce fear and build confidence.

 

[41:30] The importance of creative friction to reconnect across seamless technology divides.

 

[42:45] Taking a moment to rise above the sand, you see things have changed a lot, probably within yourself.

 

[43:40] Diverse teams earn the most even though they take the longest time to deliberate.

 

[44:45] With diverse debate, deliberating longer, with ongoing feedback, we can create better AI systems.

 

[45:53] Bias is part of human nature, so how can we reduce the asymmetry of power?

 

[49:00] If we wake up to the power we have and give away, we realize what we can do with that power.

 

[50:08] Juliette is excited to be alive right now, when we are shaping the future of digital infrastructure, digital literacy, and digital trust.

 

[50:40] Historically, curators of knowledge have been our sources of truth.

 

[53:05] We must be able to manage all this uncertainty on the individual level as a community.

 

[53:45] The Four Logics framework: government, corporate, engineering, and social justice logic.

 

[54:35] Increasing awareness of misalignment between employees’ morals and employer brands.

 

[55:47] Checking on personal values, culture, and vision that enable fulfillment.

 

[56:33] How reducing human biases with AI leads to other biases.

 

[57:27] Encourage employee experimentation with AI and launch internal challenges.

 

 

RESOURCES

 

Juliette Powell on LinkedIn

Juliette Powell’s website

Kleiner Powell International’s website

“The AI Dilemma: 7 Principles for Responsible Technology,” co-authored by Juliette

Juliette’s first book, “33 Million People in the Room: How to Create, Influence, and Run a Successful Business with Social Networking”


 

 

 

QUOTES (edited)

 

"I've always been of the perspective that I'm a whole person. There are many different parts to my whole person, but nonetheless, I try to think of myself holistically as I navigate the world."

 

"Creative friction can only come from deep diversity. The more diverse, the more they produce questions, the longer it takes to deliberate, but the better the outcomes."

 

"We need to take responsibility and intentionally co-create with AI to ensure diverse perspectives are debated, increasing initial friction to reduce asymmetries and improve capabilities and relevance."

 

"Digital trust is kind of key. If we want data, personal data, to work for everyone on the planet, and not just the usual suspects, we need to address digital trust and infrastructure."

 

"If you feel that your personal morals are being confronted by what you're being asked to do at work, now is the time to recognize that disalignment and seek a place where you can be fulfilled and work on meaningful things."

 

"I'm excited about shaping AI's future because we are the generations that get to shape it. The decisions we make now will determine where digital trust will be in the next hundred years."

 

“There is expertise in the everyday person. We don't necessarily reward financially or recognize that, but that tacit knowledge is invaluable.”

 

“If we take longer to deliberate around our AI systems in their specific use cases and context, bring in the various communities that will be affected before we start building them, and deploy them constantly incorporating that feedback, we'd have much better systems that would work for far more people.”

 

“If we all woke up a little bit more to the kind of power that we give away, then we could also realize the kind of power that we actually have if we decide to do something about it.”

 

“We have to be able to manage all this uncertainty on the individual level as a community.”

 

"If you feel that your personal morals are being confronted by what you're being asked to do at work, now is the time to recognize that disalignment and seek a place where you can be fulfilled and work on meaningful things."

 

"I'm excited about shaping AI's future because we are the generations that get to shape it. The decisions we make now will determine where digital trust will be in the next hundred years."

 

There is expertise in the everyday person. We don't necessarily reward financially or recognize that, but that tacit knowledge is invaluable.”