Ep 304: Nova Chat - AI in the C-Suite | The Leader Assistant Podcast

This episode features a Nova Chat session hosted by Nova founder and CEO Maggie Olson.

In this conversation, Maggie leads a deep dive into the next five years of the AI landscape in the C-Suite, with panelists Lawrence Coburn, Jessica McBride, and Fiona Young examining the continued importance of AI for CEOs, EAs, and CoSs and sharing helpful advice for navigating new AI platforms in the C-Suite.

A few topics the group discusses…

✴️ How AI affects CEOs, EAs, CoSs

✴️ The AI landscape over the next 5 years

✴️ The continued relevance of C-Suite support staff

✴️ Navigating AI as a C-Suite user

✴️ C-Suite AI tools on the market

…and more!

ABOUT FIONA


Fiona Young is the founder of Carve, a series of live digital courses for executive assistants to learn AI, create capacity and develop into a strategic assistant — carving out their career growth in the process. Before launching Carve, Fiona spent five years leading learning programs at Hive Learning, the B2B peer learning app. She previously ran Learning & Development for Blenheim Chalcot, one of the world’s most successful venture builders, overseeing group learning strategy and programs for 3,000 people across 25 ventures. Fiona started her career as an executive assistant in entrepreneurial businesses.

ABOUT JESSICA


Jessica McBride is the brains behind Tech Savvy Assistant, a go-to hub for modernizing admin work through tech. With over 10 years in the admin field, she knows the ins and outs of the job and how tech can make it better. Jessica is all about teaching. She’s on a mission to help admin pros get tech-savvy to make their lives easier and up their game. She does this through various channels, but speaking events are her forte. Notably, she was a speaker at the 2023 Administrative Professional Conference in Las Vegas. Her knack for breaking down complex tech topics and her engaging style make her a popular speaker.

ABOUT LAWRENCE


Lawrence Coburn is the CEO and co-founder of Ambient, an AI Assistant purpose-built for the Chief of Staff. Previously he co-founded Twine, a leader in networking software, and DoubleDutch, the market leader for mobile event apps. Lawrence lives in the Mission District of San Francisco with his entrepreneur wife and 13-year-old daughter. When he’s not working, you can find him riding a bicycle, or playing tennis with his daughter.

ABOUT MAGGIE


Maggie Olson is the Founder of Nova Chief of Staff, the premier destination for Chief of Staff education and development. As the first Chief of Staff to a president at a Fortune 40 company — who led a multibillion-dollar business with 5,000+ employees — Maggie built the president’s Chief of Staff model from the ground up. Maggie has 20 years’ experience leading large teams and has spent her career focused on both customer and employee experience at companies including T-Mobile, Nordstrom, and Starbucks. In addition to operating the Nova Chief of Staff Certification course, Maggie is a fractional Chief of Staff focused on helping mission-driven, for-profit startup founders scale their businesses quickly. In her spare time, Maggie loves spending time outside with her husband, their animals, and their 1-year-old, Max!

–––
THE LEADER ASSISTANT PODCAST IS PRESENTED BY NOVA CHIEF OF STAFF


Calling all Executive Assistants: Are you looking for a way to elevate your skills or earn that promotion you’ve been eyeing? Nova Chief of Staff’s online certification course provides you with the knowledge and confidence you need to stand out on the job. Whether you want to land your dream position or level up in your current role, Nova’s self-paced course gives you hands-on practice doing what Chiefs of Staff do every day.

Visit leaderassistant.com/nova to learn more and secure your spot!

–––
THE LEADER ASSISTANT PREMIUM MEMBERSHIP

To learn more about how you can join growth-minded Leader Assistants, check out our Leader Assistant Premium Membership for ongoing training, coaching, and community.

THE LEADER ASSISTANT BOOK

Download the first 3 chapters of The Leader Assistant: Four Pillars of a Confident, Game-Changing Assistant for FREE here or buy it on Amazon and listen to the audiobook on Audible. Also, check out the companion study guide, The Leader Assistant Workbook, to dig deeper.

LEADER ASSISTANT LIVE EVENTS

Check out our constantly updated schedule of events for admins and assistants at LeaderAssistantLive.com.

JOIN THE FREE COMMUNITY

Join the Leader Assistant Global Community for bonus content, job opportunities, and to network with other assistants who are committed to becoming leaders!

SUBSCRIBE

Subscribe to The Leader Assistant Podcast so you don’t miss new episodes!

You can find the show on Apple Podcasts, Spotify, Google Podcasts, Pandora, and Stitcher.

Join my email list here if you want to get an email when a new episode goes live.

LEAVE A REVIEW

If you’re enjoying the podcast, please take 2 minutes to rate and review the show on Apple Podcasts here. Each review helps me stay motivated to keep the show going!


–––
EPISODE TRANSCRIPT

00:00:00.020 –> 00:00:03.540
JEREMY: Hey, friends, welcome to episode 304 of The Leader Assistant Podcast.

00:00:03.540 –> 00:00:10.500
JEREMY: It’s your host, Jeremy Burrows, and I’m excited to have the opportunity to share another Nova Chat conversation.

00:00:10.500 –> 00:00:15.960
JEREMY: And I’m going to keep my intro short because there’s a lot of great insight in this conversation.

00:00:15.960 –> 00:00:19.560
JEREMY: And it’s all about artificial intelligence in the C-suite.

00:00:19.560 –> 00:00:25.800
JEREMY: And Maggie Olson talks with Lawrence Coburn, Jessica McBride and Fiona Young.

00:00:25.800 –> 00:00:27.520
JEREMY: So I hope you enjoy this conversation.

00:00:27.640 –> 00:00:30.340
JEREMY: Again, another Nova Chat with my friend Maggie Olson.

00:00:30.340 –> 00:00:37.080
JEREMY: You can check out the show notes at leaderassistant.com/304 to find out about all of those featured in this chat.

00:00:43.242 –> 00:00:50.842
SPEAKER 3: The Leader Assistant Podcast exists to encourage and challenge assistants to become confident, game-changing leader assistants.

00:01:00.292 –> 00:01:07.532
JEREMY: Hey, Leader Assistants, have you heard the Nova Chief of Staff Certification course is about to see a price increase?

00:01:07.532 –> 00:01:14.432
JEREMY: But don’t worry, you can enroll now, lock in the current rate, and start whenever you’re ready with lifetime access.

00:01:14.432 –> 00:01:18.472
JEREMY: Nova’s mission is to give you the ultimate student experience.

00:01:18.472 –> 00:01:31.892
JEREMY: They’ve packed the course with dozens of templates, self-paced learning, hands-on practice, multiple instructor touchpoints, peer engagement, and even guest-authored assignments.

00:01:31.892 –> 00:01:39.112
JEREMY: With over 500 students across 22 countries, Nova is the top spot for Chief of Staff Learning and Development.

00:01:39.112 –> 00:01:40.092
JEREMY: Don’t wait.

00:01:40.092 –> 00:01:43.292
JEREMY: Enroll today and join the community at leaderassistant.com/nova.

00:01:45.592 –> 00:01:47.892
MAGGIE: Thank you guys all for being here.

00:01:47.892 –> 00:01:48.912
MAGGIE: We’re all shocked.

00:01:48.912 –> 00:01:53.552
MAGGIE: We had 600 plus people register for this event, and the list continues to grow.

00:01:53.552 –> 00:02:01.472
MAGGIE: I think we’ve got almost 150 people joined at this point, and we will absolutely send this recording out to everybody who’s registered.

00:02:01.472 –> 00:02:02.492
MAGGIE: So don’t worry about that.

00:02:02.492 –> 00:02:05.432
MAGGIE: You will get the recording in the next day or two.

00:02:05.432 –> 00:02:07.012
MAGGIE: And my name is Maggie.

00:02:07.012 –> 00:02:10.472
MAGGIE: I am the founder of Nova Chief of Staff.

00:02:10.472 –> 00:02:12.972
MAGGIE: You can learn more about us at novachiefofstaff.com.

00:02:12.972 –> 00:02:19.892
MAGGIE: But today is really all about this very relevant discussion around AI use in the C-suite.

00:02:19.892 –> 00:02:34.852
MAGGIE: Specifically, we’re going to talk about Chiefs of Staff, Executive Assistants, CEOs, the AI landscape over the next five years, the actual practical tools and things that you could all be doing right now if you’re not already.

00:02:34.852 –> 00:02:40.032
MAGGIE: In regards to AI, we’re going to start very generally around like what are the basics?

00:02:40.032 –> 00:02:40.772
MAGGIE: What is AI?

00:02:40.772 –> 00:02:42.932
MAGGIE: What’s generative AI?

00:02:42.932 –> 00:02:56.172
MAGGIE: And then we’ll leave you all, we’ll send out in our email with the recording some tools from Fiona, Lawrence and Jessica to show you some quick and easy things that AI is already doing and that you can be doing with AI.

00:02:56.172 –> 00:03:03.052
MAGGIE: So introductions, Lawrence Coburn is the CEO and co-founder of Ambient AI.

00:03:03.052 –> 00:03:08.392
MAGGIE: And Ambient is an AI assistant purpose-built for the chief of staff.

00:03:08.392 –> 00:03:10.112
MAGGIE: It’s very exciting.

00:03:10.112 –> 00:03:16.392
MAGGIE: Previously, he co-founded Twine, a leader in networking software, and DoubleDutch, the market leader for mobile event apps.

00:03:16.392 –> 00:03:18.852
MAGGIE: Lawrence lives in the Mission District of San Francisco.

00:03:18.852 –> 00:03:22.692
MAGGIE: If you missed that earlier, we were doing a little bit of where you’re located on our panel here.

00:03:23.452 –> 00:03:28.812
MAGGIE: And he lives with his entrepreneur wife and their 13-year-old daughter who’s a tennis champion.

00:03:28.812 –> 00:03:32.852
MAGGIE: When he’s not working, you can find him riding a bicycle or playing tennis with his daughter.

00:03:32.852 –> 00:03:34.812
MAGGIE: So moving on to Ms.

00:03:34.812 –> 00:03:36.172
MAGGIE: Jessica McBride.

00:03:36.172 –> 00:03:44.792
MAGGIE: Jessica is the brains behind Tech Savvy Assistant, a go-to hub for modernizing admin work through tech with over 10 years in the admin field.

00:03:45.272 –> 00:03:49.372
MAGGIE: She knows the ins and outs of the job and how tech can make it better.

00:03:49.372 –> 00:03:50.592
MAGGIE: Jessica is all about teaching.

00:03:50.892 –> 00:03:56.432
MAGGIE: So she’s on a mission to help admin pros get tech savvy and make their lives easier and up their game.

00:03:56.432 –> 00:04:00.132
MAGGIE: She does this through various channels, but speaking events are her thing.

00:04:00.132 –> 00:04:10.612
MAGGIE: So notably, she was a speaker at the 2023 Admin Professional Conference in Las Vegas, and her knack for breaking down complex tech topics and her engaging style make her a popular speaker.

00:04:10.612 –> 00:04:13.512
MAGGIE: So we’re very happy to have both Jessica and Lawrence.

00:04:13.512 –> 00:04:15.252
MAGGIE: And finally, Fiona.

00:04:15.252 –> 00:04:28.132
MAGGIE: So Fiona is the founder of CARVE, a series of live digital courses for executive assistants to learn AI, create capacity and develop into strategic assistants, carving out their career growth in the process.

00:04:28.132 –> 00:04:35.452
MAGGIE: So before launching CARVE, Fiona spent five years leading learning programs at Hive Learning, the B2B peer learning app.

00:04:35.452 –> 00:04:46.192
MAGGIE: She previously ran L&D for Blenheim Chalcot, one of the world’s most successful venture builders overseeing group learning strategy and programs for 3,000 people across 25 ventures.

00:04:46.692 –> 00:04:51.092
MAGGIE: Fiona started her career as an executive assistant in entrepreneurial businesses.

00:04:51.092 –> 00:04:52.112
MAGGIE: So again, I’m Maggie.

00:04:52.112 –> 00:04:58.812
MAGGIE: I am not an AI expert, but I am going to be moderating this panel with these wonderful AI experts here.

00:04:58.812 –> 00:05:00.392
MAGGIE: So thanks again, everybody, for joining us.

00:05:00.392 –> 00:05:03.692
MAGGIE: And we are going to dive right in.

00:05:03.692 –> 00:05:06.772
MAGGIE: So Lawrence, we are going to start with you.

00:05:06.772 –> 00:05:10.392
MAGGIE: Can you, like, let’s just break it down back to basics.

00:05:10.392 –> 00:05:17.812
MAGGIE: A lot of us probably weren’t even talking about AI until a few months ago or maybe until a few weeks ago when we saw this panel come up.

00:05:17.812 –> 00:05:21.252
MAGGIE: What is AI and what is generative AI?

00:05:21.252 –> 00:05:23.112
MAGGIE: What’s the difference?

00:05:23.112 –> 00:05:23.432
LAWRENCE: Yeah.

00:05:23.712 –> 00:05:28.352
LAWRENCE: So I think AI has been around in some format for 50, 60 years.

00:05:28.352 –> 00:05:32.032
LAWRENCE: So the work has been happening in this domain for a long time.

00:05:32.032 –> 00:05:36.012
LAWRENCE: Generative AI is a class of AI technologies.

00:05:36.012 –> 00:05:46.672
LAWRENCE: And so just to sort of set some context, like some of the things that you might have heard of before, computer vision, the ability for computers to sort of recognize, like this is a picture of a cat.

00:05:46.672 –> 00:05:52.652
LAWRENCE: Machine learning, the ability for computers to make predictions about what’s going to happen based on a dataset.

00:05:52.652 –> 00:06:00.292
LAWRENCE: But the big jump that’s happening with generative AI is the ability for computers to create new content.

00:06:00.292 –> 00:06:06.352
LAWRENCE: And content could be an article, it could be an image, it could be a song.

00:06:06.352 –> 00:06:16.912
LAWRENCE: And so the ability to sort of conjure up out of nothing, a song that never existed before, a piece of art that never existed before, or a blog post that never existed before.

00:06:16.912 –> 00:06:18.272
LAWRENCE: This is the new thing.

00:06:18.272 –> 00:06:23.432
LAWRENCE: And this is the thing that’s starting to approach sort of human levels of creativity.

00:06:23.432 –> 00:06:27.412
LAWRENCE: And that’s, I think, why the world is getting really excited right now.

00:06:28.532 –> 00:06:36.912
MAGGIE: Okay, so can you kind of maybe go back a little bit and just help us understand the difference between AI and then what’s generative AI?

00:06:36.912 –> 00:06:38.932
MAGGIE: I hear both of those terms a lot.

00:06:38.932 –> 00:06:46.952
LAWRENCE: Yeah, so I mean, the easiest way my co-founder likes to describe it, which I think is smart, is like previous flavors of AI have been about analysis.

00:06:46.952 –> 00:06:49.292
LAWRENCE: Like think about an analyst.

00:06:49.292 –> 00:06:53.152
LAWRENCE: Generative AI is about creator, like creating new stuff.

00:06:53.152 –> 00:06:54.852
LAWRENCE: And that’s human-grade content.

00:06:54.852 –> 00:06:56.232
MAGGIE: Got it.

00:06:56.232 –> 00:06:57.532
MAGGIE: Okay, okay.

00:06:57.532 –> 00:07:04.272
MAGGIE: Fiona, do you have anything to add there in terms of just the basic definition around AI and generative AI to help our audience?

00:07:04.272 –> 00:07:09.712
FIONA: Yeah, I think that differentiation between analyst and creation is a good one.

00:07:09.952 –> 00:07:18.312
FIONA: I would just say on AI specifically, you would imagine that these are typically systems that were making decisions.

00:07:18.312 –> 00:07:24.372
FIONA: So they’re taking data inputs and they are able to autonomously make decisions using that data.

00:07:24.372 –> 00:07:31.492
FIONA: For example, think about some of the technology that’s been around for decades, like credit card fraud detection, right?

00:07:31.492 –> 00:07:38.012
FIONA: I mean, I can remember being abroad when I was a kid and having our credit card stopped because, you know, suspicious activity.

00:07:38.012 –> 00:07:40.552
FIONA: So, you know, that stuff has been around ages.

00:07:40.552 –> 00:08:04.032
FIONA: But what’s new is this ability for the machine to be able to create totally new content and really everything from text, images, voice, code, you know, and that is really game-changing because for the first time ever, it’s really threatened what you might call white collar jobs, you know, desk-based knowledge workers.

00:08:04.032 –> 00:08:15.832
FIONA: Although obviously automation is old news for folks in front-line positions, I think this is the first time for those of us working in desk-based jobs that we’re thinking, oh, shit, this could actually take my job.

00:08:15.832 –> 00:08:17.112
MAGGIE: Interesting.

00:08:17.112 –> 00:08:22.032
MAGGIE: Jessica, since we’re just starting out here, I do want to hear from all of you from like a definition perspective.

00:08:22.032 –> 00:08:26.112
MAGGIE: Anything to add or anything that you help teach your students around?

00:08:26.112 –> 00:08:29.212
MAGGIE: How do we understand just AI from a basic level nature?

00:08:30.272 –> 00:08:33.652
JESSICA: Yeah, so the way that I think about it is like we’ve interacted with AI.

00:08:33.652 –> 00:08:36.352
JESSICA: Like Lawrence said, it’s been around for 50, 60 years.

00:08:36.352 –> 00:08:43.212
JESSICA: There’s been several different versions of it that have slowly integrated more and more into our day-to-day life.

00:08:43.212 –> 00:08:49.192
JESSICA: Like all of us have interacted with like Google or Siri or Alexa, and that’s a very basic version of AI.

00:08:49.192 –> 00:08:54.212
JESSICA: You’re giving it a command, it’s processing that data, and then it’s just giving you a response.

00:08:54.212 –> 00:09:01.512
JESSICA: And never before have you been able to give it an idea and then bring that idea to life, for example.

00:09:01.512 –> 00:09:05.532
JESSICA: It’s been the difference between just a “yes, no” and a “yes, and.”

00:09:05.532 –> 00:09:12.252
JESSICA: And I think that when you’re learning to use it, it’s saying like, what am I wanting to create here?

00:09:12.252 –> 00:09:13.912
JESSICA: What am I wanting to explore?

00:09:13.912 –> 00:09:27.772
JESSICA: And using it really as a thought partner is always how I’ve described using ChatGPT or large language models: it’s digging into it and creating something with it and using it to boost that.

00:09:27.772 –> 00:09:34.772
JESSICA: So it’s just it’s expanding and empowering us in a way that we haven’t been able to utilize before.

00:09:34.772 –> 00:09:36.052
MAGGIE: Okay, yeah, I really like that.

00:09:36.052 –> 00:09:39.672
MAGGIE: You’re giving an idea and then it’s bringing it to life.

00:09:39.672 –> 00:09:41.832
MAGGIE: And that is the generative part of AI.

00:09:41.832 –> 00:09:42.992
MAGGIE: Does that sound right?

00:09:42.992 –> 00:09:43.132
MAGGIE: Yeah.

00:09:43.872 –> 00:09:49.432
MAGGIE: So Fiona, why is it important that we start learning about AI now?

00:09:49.432 –> 00:09:52.452
MAGGIE: And this is just a general question, not role specific.

00:09:52.452 –> 00:09:57.832
MAGGIE: So all of us, everybody, why is it important that we start learning about it?

00:09:57.832 –> 00:10:01.032
FIONA: You know, this space is moving so quickly.

00:10:01.032 –> 00:10:05.692
FIONA: I don’t know about all of you, but I’m subscribed to many, many dozens of daily newsletters.

00:10:05.692 –> 00:10:09.212
FIONA: And I personally find it overwhelming to keep up and it’s my job to keep up.

00:10:09.832 –> 00:10:24.072
FIONA: So I think what I would just advise everyone here today is, given the pace of development in this space, you need to start now because every day that you wait, it just gets harder and harder.

00:10:24.072 –> 00:10:25.852
FIONA: And, you know, I get it.

00:10:25.852 –> 00:10:29.112
FIONA: For those of you out there who are EAs, I have literally been in your role.

00:10:29.112 –> 00:10:32.292
FIONA: And I know it’s really tough to prioritize your own learning.

00:10:32.292 –> 00:10:37.532
FIONA: So like I’m not sitting here saying, you know, gosh, you have so much time sitting around twiddling your thumbs.

00:10:37.532 –> 00:10:38.992
FIONA: Like, why don’t you go out and do this?

00:10:38.992 –> 00:10:40.372
FIONA: Like, I really do get it.

00:10:40.372 –> 00:10:46.852
FIONA: But ironically, I think if you take on these AI tools, you’ll actually create capacity for yourself.

00:10:46.852 –> 00:10:49.032
FIONA: You know, so it’s like an investment.

00:10:49.032 –> 00:11:01.692
FIONA: You know, you need to invest a bit of time to wrap your mind around this, understand how it’s going to work for you in order to pay out dividends over the long run, which you’ll see in the time savings that you make on a daily and weekly basis when you’re using these tools.

00:11:02.892 –> 00:11:20.692
FIONA: And, you know, I think it’s also worth mentioning, too, that, you know, if you have extra capacity from using these fabulous tools, like it’s great to be able to use that to take on the more strategic work and the more human work, but it’s also great to use that spare capacity to just give yourself some work-life balance.

00:11:20.692 –> 00:11:29.172
FIONA: Most of the folks I’m working with, C-suite assistants out there, are working 60-hour-plus weeks, week-on-week, and that’s obviously not sustainable.

00:11:29.172 –> 00:11:35.412
FIONA: So, you know, another way to look at these tools is like, what a great opportunity for me to get a bit more balance in my life.

00:11:36.512 –> 00:11:37.912
MAGGIE: Yeah, I think that’s super helpful.

00:11:37.912 –> 00:11:43.092
MAGGIE: So diving in a little bit to what you said, you said it’s going to get harder and harder the longer we wait.

00:11:43.092 –> 00:11:47.952
MAGGIE: So what exactly is going to get harder about learning AI if we keep waiting to learn it?

00:11:53.807 –> 00:11:54.747
MAGGIE: Have you heard?

00:11:54.747 –> 00:12:00.307
MAGGIE: The Nova Chief of Staff Certification Course price is going up in January of 2025.

00:12:00.307 –> 00:12:04.087
MAGGIE: Enroll now and with lifetime access, start whenever you please.

00:12:04.087 –> 00:12:08.807
MAGGIE: Here at Nova, it’s our mission to provide the very best student experience possible.

00:12:08.807 –> 00:12:13.547
MAGGIE: Our course is chock-full of features and resources designed just for you.

00:12:13.547 –> 00:12:24.487
MAGGIE: Dozens of templates, self-paced online learning, hands-on practice, multiple instructor touch points, peer engagement opportunities, guest-authored assignments, the list goes on.

00:12:24.487 –> 00:12:31.667
MAGGIE: With 500 students across 22 countries, Nova is the premier destination for Chief of Staff Learning and Development.

00:12:31.667 –> 00:12:33.987
MAGGIE: Enroll today and join us.

00:12:37.927 –> 00:13:05.367
FIONA: Yeah, just given how many tools are coming out literally every day, and the pace of development of the tools that are already out there, I think the longer you wait to learn the fundamental skills around them, for instance, when we think about LLMs, large language models, so tools like ChatGPT, you may call them AI Chatbots, the basic skills of that tool are really around prompting.

00:13:06.507 –> 00:13:15.207
FIONA: Funny enough, that is the same skill that you need to use any AI component in any tool that is using GenAI within it right now.

00:13:15.987 –> 00:13:26.987
FIONA: For instance, if you’re using Notion as a way to gather and organize all of your documents in your life, well, you still need to know how to prompt the engine that sits within Notion.

00:13:26.987 –> 00:13:30.147
FIONA: It is really the same thing.

00:13:30.147 –> 00:13:41.847
FIONA: Those fundamental skills of how to use these tools, how to actually get value out of them, will carry across the whole tools landscape and will benefit you moving forward.

00:13:43.027 –> 00:13:48.207
FIONA: The sooner you can really get on that path to learn them, the better.

00:13:48.847 –> 00:13:53.547
LAWRENCE: I think just to chime in on that, I think a little more optimistic flavor of what Fiona is saying.

00:13:53.987 –> 00:13:58.627
LAWRENCE: I agree with what she said; you are still early enough.

00:13:59.487 –> 00:14:07.007
LAWRENCE: The foremost experts in generative AI applications, like the actual tools built on top, have about a year of experience.

00:14:08.007 –> 00:14:16.187
LAWRENCE: The fact that you are here on this webinar means that you have a chance to be the expert at your company in this emerging domain.

00:14:16.607 –> 00:14:18.567
LAWRENCE: Let’s not make any mistake here.

00:14:18.567 –> 00:14:20.307
LAWRENCE: This is the big one.

00:14:20.547 –> 00:14:22.247
LAWRENCE: I’m here in Silicon Valley.

00:14:22.247 –> 00:14:25.227
LAWRENCE: There have been two big waves in the last 20 years.

00:14:25.227 –> 00:14:28.407
LAWRENCE: One is the Internet itself, the second is mobile.

00:14:28.407 –> 00:14:30.827
LAWRENCE: This one is going to be bigger than all of them.

00:14:30.827 –> 00:14:44.587
LAWRENCE: I think that you’re here, there’s an opportunity to learn a skill set that’s going to stamp your ticket for the next 20 years of your career because I believe that the chiefs of staff that are fluent in AI are going to outcompete the ones that are not.

00:14:45.007 –> 00:14:47.987
LAWRENCE: It’s just an education process and the tools are out there.

00:14:48.527 –> 00:14:51.227
LAWRENCE: I agree with Fiona 100 percent.

00:14:51.227 –> 00:14:52.727
MAGGIE: Fiona, you mentioned prompting.

00:14:52.727 –> 00:14:56.847
MAGGIE: I just want to make sure that we’re just describing the terms that we’re using.

00:14:58.307 –> 00:15:07.427
MAGGIE: Before ChatGPT came out, I thought that there was this art and this science to putting the right thing into Google to find what I was looking for.

00:15:07.827 –> 00:15:10.667
MAGGIE: And that’s what prompting is in the ChatGPT space too, right?

00:15:10.667 –> 00:15:16.027
MAGGIE: It’s like, what do we put into ChatGPT to get back what we’re looking for and what we need?

00:15:16.027 –> 00:15:21.287
MAGGIE: I’m sure you can define it a little bit better, but do you want to just kind of close the loop on that?

00:15:21.287 –> 00:15:22.367
FIONA: Yeah, precisely.

00:15:22.367 –> 00:15:32.507
FIONA: So a prompt is simply the question that you’re putting into this tool or the instructions that you’re giving it, which then leads to, of course, giving you an output.

00:15:33.027 –> 00:15:41.227
FIONA: And so the richer that question, the more detailed, the more context you’re able to provide, the better the result that you’ll get out the back of it.

00:15:41.227 –> 00:15:47.567
FIONA: And so you can think of it as, like, imagine you’re talking to a five-year-old, you know, you’re going to really need to spell things out.

00:15:47.567 –> 00:15:56.587
FIONA: This is a, you know, this is a model that does not have background information on you and your very specific world and what you’re looking for.

00:15:56.587 –> 00:15:59.187
FIONA: So you really need to define that clearly.

00:15:59.187 –> 00:16:05.267
FIONA: And what I find so often with folks I work with is, oh, this stuff is so underwhelming.

00:16:05.267 –> 00:16:08.227
FIONA: Like I’ve used ChatGPT, it was rubbish.

00:16:08.227 –> 00:16:10.647
FIONA: And, you know, I ask people questions, well, how did you use it?

00:16:10.807 –> 00:16:13.347
FIONA: And walk me through, what did you put into it?

00:16:13.347 –> 00:16:23.187
FIONA: And yeah, I mean, I think it’s very easy to put a very simple question into these models and get a very simple answer that is really not aligned to what you’re looking for.

00:16:23.187 –> 00:16:26.607
FIONA: So it’s all about the nuance that you add into that prompt.

00:16:26.607 –> 00:16:29.227
FIONA: And also embracing a kind of a back and forth.

00:16:29.247 –> 00:16:37.627
FIONA: It is a chatbot; it’s designed for you to chat with it, to have a back and forth conversation, and not to take that first output as being the final one.

00:16:37.627 –> 00:16:41.667
FIONA: And I think that’s a really critical learning too in prompting.

00:16:41.667 –> 00:16:42.387
MAGGIE: Okay, awesome.

00:16:42.387 –> 00:16:44.727
MAGGIE: Yeah, thanks, I think that’s super helpful.

00:16:44.727 –> 00:16:52.647
LAWRENCE: Have you all heard the comparison between prompts and like spells, like a book of spells, like a grimoire is a book of spells.

00:16:52.987 –> 00:17:06.687
LAWRENCE: And I really like that because when you’re talking to these models, you don’t really know what’s going to come back, and you have to use, to Fiona’s point, exactly the right words and describe exactly what you want in a way that the machine can understand.

00:17:06.687 –> 00:17:09.587
LAWRENCE: And so it’s almost like you’re talking to this like magical being.

00:17:09.587 –> 00:17:13.327
LAWRENCE: So like the spell book comparison is really cool.

00:17:13.327 –> 00:17:14.647
MAGGIE: I love that.

00:17:14.647 –> 00:17:18.247
MAGGIE: Before we move on, anything else to add on this topic?

00:17:20.267 –> 00:17:25.247
MAGGIE: Okay, so let’s dive into some role specifics here.

00:17:25.247 –> 00:17:26.327
MAGGIE: I want to start with you, Jessica.

00:17:27.107 –> 00:17:32.247
MAGGIE: Why do EAs specifically need to jump on generative AI?

00:17:32.247 –> 00:17:34.407
MAGGIE: So let’s talk about the EA role.

00:17:34.407 –> 00:17:35.167
MAGGIE: Take your time.

00:17:35.167 –> 00:17:36.987
MAGGIE: Let’s dive into all of the things.

00:17:37.567 –> 00:17:38.747
MAGGIE: We’re on a 90-minute panel.

00:17:38.747 –> 00:17:43.647
MAGGIE: And just a reminder to the folks, we will get to questions, take them down.

00:17:43.647 –> 00:17:48.847
MAGGIE: We’ll run through questions towards the end of the panel, but we’ll keep this going here from a flow perspective.

00:17:48.847 –> 00:17:52.527
MAGGIE: So why EAs and generative AI, Jessica?

00:17:52.527 –> 00:17:53.447
JESSICA: Okay, there’s a lot.

00:17:54.187 –> 00:18:05.087
JESSICA: So for one thing, the administrative, we talked about a little bit earlier, why it’s important in general, why the people are learning is the administrative profession has been shrinking significantly.

00:18:05.087 –> 00:18:10.147
JESSICA: The average age of an executive assistant in North America is something like 49.

00:18:10.147 –> 00:18:16.167
JESSICA: We haven’t been able to get a lot of people really interested in this role for a lot of reasons.

00:18:16.167 –> 00:18:19.047
JESSICA: It’s kind of gotten to be a bit boring at times.

00:18:19.047 –> 00:18:26.127
JESSICA: And I think that by embracing AI, we can really shift the admin field in an entirely new direction.

00:18:26.127 –> 00:18:35.027
JESSICA: Because we have this massive set of generalist skills, and they’re really not being utilized by a lot of organizations to the best of their capabilities.

00:18:35.027 –> 00:18:39.527
JESSICA: And a lot of times the blocker is something as simple as I don’t have anybody to ask.

00:18:39.527 –> 00:18:43.727
JESSICA: When I was an executive assistant, I was the only executive assistant in the office.

00:18:43.727 –> 00:18:49.447
JESSICA: And that’s what led me to joining a bunch of different admin organizations, groups, Facebook groups even.

00:18:50.167 –> 00:18:56.387
JESSICA: And I found myself kind of mentoring even more than I was really asking for opinions.

00:18:56.387 –> 00:19:00.647
JESSICA: But people really needed help and guidance and didn’t have anybody to turn to.

00:19:00.647 –> 00:19:05.587
JESSICA: And I was an executive assistant right on the cusp of ChatGPT.

00:19:05.587 –> 00:19:07.727
JESSICA: I got laid off in February 2023.

00:19:07.727 –> 00:19:15.687
JESSICA: So I had started using ChatGPT just to see how it would solve problems for executive assistants in my communities.

00:19:16.287 –> 00:19:29.887
JESSICA: And I was recognizing very quickly what a great tool it was for things like templates, for standard operating procedures, for just working through a problem that you didn’t have anybody else to work through it with.

00:19:29.887 –> 00:19:36.327
JESSICA: There are also a lot of less obvious ways to start using ChatGPT.

00:19:36.327 –> 00:19:52.187
JESSICA: But since then, it’s really become my go-to. I don’t use Google that much anymore, because Google wasn’t helpful when my CEO came to me and asked me to create an engaging presentation with very little guidance on what to actually put in it.

00:19:52.187 –> 00:19:56.467
JESSICA: I can’t go to Google and be like, tell me what I need to put in for a pitch deck.

00:19:56.467 –> 00:20:09.307
JESSICA: But I can go to ChatGPT and say, this is my company, this is what I do, this is the outcome that I want, help me create a pitch deck outline, and it’s going to create this step-by-step outline for me.

00:20:09.307 –> 00:20:15.167
JESSICA: This is why I talk about using it as a thought partner: I’m not going to ChatGPT and asking it to procure that data for me.

00:20:15.167 –> 00:20:18.987
JESSICA: I’m asking it to put my thoughts in an organized way.

00:20:18.987 –> 00:20:28.647
JESSICA: How can I take the knowledge that I have, transmute it, and use ChatGPT to put it into a much more organized form?

00:20:28.647 –> 00:20:37.407
JESSICA: Thinking about using it that way, versus going to ChatGPT to get an answer, is really where the magic is at, and where it becomes this incredible tool.
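Jessica’s context-first pattern (company, what you do, desired outcome, then the task) can be sketched as a tiny prompt builder. This is an illustrative sketch, not any product’s API; the field names and example details are made up.

```python
# Hypothetical sketch of a "thought partner" prompt: give the model
# context and a desired outcome instead of asking it to procure facts.

def build_prompt(company: str, what_we_do: str, outcome: str, task: str) -> str:
    """Assemble a context-first prompt for a chat assistant."""
    return (
        f"This is my company: {company}.\n"
        f"This is what we do: {what_we_do}.\n"
        f"This is the outcome I want: {outcome}.\n"
        f"Task: {task}"
    )

prompt = build_prompt(
    company="Acme Robotics",
    what_we_do="we build warehouse automation robots",
    outcome="a pitch deck that wins over investors",
    task="help me create a step-by-step pitch deck outline",
)
print(prompt)
```

The point is the ordering: context first, task last, so the model organizes your knowledge rather than inventing facts.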

00:20:39.627 –> 00:20:50.467
JESSICA: I do think that the more and more we invest in learning and understanding AI, the more room there is to grow within your role and your space.

00:20:50.467 –> 00:20:50.767
MAGGIE: Okay.

00:20:50.767 –> 00:20:51.347
MAGGIE: Thank you.

00:20:51.347 –> 00:20:55.567
MAGGIE: Fiona, you also train assistants, so I want to give you an opportunity to jump in here.

00:20:55.567 –> 00:20:56.967
MAGGIE: Anything to add?

00:20:59.067 –> 00:21:14.087
FIONA: I think what Jessica said was brilliant, and I guess zooming out as a person who’s spent the last decade in learning and development and worked on a lot of change programs with people teams, HR teams, like I’m sure many of you in the audience would have done as well.

00:21:14.087 –> 00:21:21.827
FIONA: I would just say that my view is that AI may not take your job, but it will fundamentally change it.

00:21:21.827 –> 00:21:27.107
FIONA: And what I mean by that is it’s going to take a whole lot of repeatable admin off your plate.

00:21:27.107 –> 00:21:43.127
FIONA: So if you can just imagine, in a year’s time, 50% of the time you currently spend in the calendar, 50% of the time you currently spend sending email just vanishes. It’s going to fundamentally, seismically change the shape of your role.

00:21:43.127 –> 00:21:59.987
FIONA: And so one thing that I really like to suggest for folks, you know, in an EA role, thinking about AI is lean into this stuff now so that you are able to rewrite your own job description because we are all going to have to rewrite our job descriptions, myself included.

00:21:59.987 –> 00:22:02.727
FIONA: You know, don’t let somebody else do that for you.

00:22:03.107 –> 00:22:11.747
FIONA: Be on the front foot thinking now about what could my role look like if literally half of what I do right now vanishes in a year.

00:22:11.747 –> 00:22:14.267
FIONA: You know, and I think that’s the mindset we all need to be in.

00:22:14.267 –> 00:22:19.067
FIONA: And certainly a lot of the reports I’m reading as well, you know, are showing similar things.

00:22:19.067 –> 00:22:27.347
FIONA: I think the WEF report that came out last year showed that something like 60% of people are going to need to reskill by 2027.

00:22:27.347 –> 00:22:29.407
FIONA: That is not far off, right?

00:22:29.407 –> 00:22:30.627
FIONA: That’s a few years time.

00:22:30.747 –> 00:22:33.927
FIONA: So that’s a pretty incredible statistic.

00:22:33.927 –> 00:22:39.527
FIONA: So yeah, start thinking now about how you’re going to retool and use the strengths you have, like Jessica mentioned.

00:22:39.527 –> 00:22:41.907
FIONA: We are amazing generalists.

00:22:41.907 –> 00:22:51.887
FIONA: We have such an incredible ability to see the bigger picture as well as work on the nitty-gritty and build relationships and do all the human stuff brilliantly.

00:22:51.887 –> 00:22:54.107
FIONA: You know, there’s a whole lot of things you can do with that.

00:22:54.107 –> 00:23:04.087
FIONA: It’s just all about diving in there, understanding the potential we have right now with AI and really thinking critically about what your role might look like and shaping it for yourself.

00:23:05.207 –> 00:23:11.747
MAGGIE: Yeah, you know, it just makes me think about how everybody in the C-suite, including executive assistants, is a leader.

00:23:11.747 –> 00:23:16.547
MAGGIE: And if all of our leaders aren’t adopting first, then we’re going to be out of the loop.

00:23:16.547 –> 00:23:20.767
MAGGIE: We’re not going to be able to guide the people on our teams and within our company.

00:23:20.767 –> 00:23:30.047
MAGGIE: So from a leadership perspective, it seems to make a lot of sense as well as a tactical and administrative and like reducing the workload and being able to think more strategically.

00:23:30.047 –> 00:23:37.147
MAGGIE: So Lawrence, let’s dive into why chiefs of staff specifically need to jump on generative AI.

00:23:37.147 –> 00:23:38.587
LAWRENCE: Yeah, I love what you just said.

00:23:38.587 –> 00:23:42.947
LAWRENCE: And I think it’s related to the unique aspects of the chief of staff role.

00:23:42.947 –> 00:23:49.667
LAWRENCE: I think that the chief just by definition is such a low ego role.

00:23:49.667 –> 00:23:56.347
LAWRENCE: They have to be thinking about how to push their organization forward, how to unlock their principal.

00:23:57.247 –> 00:24:01.387
LAWRENCE: The chief of staff is the queen or the king of the special project.

00:24:01.387 –> 00:24:11.287
LAWRENCE: And so what I’m hearing from the field is that there is no role that is better suited to usher in the next chapter of generative AI into a company than the chief of staff.

00:24:11.287 –> 00:24:13.747
LAWRENCE: They are made for this moment.

00:24:13.747 –> 00:24:15.507
LAWRENCE: So I think there are two lenses here.

00:24:15.507 –> 00:24:19.207
LAWRENCE: I think there’s a selfish lens, and I don’t mean this in a bad way.

00:24:19.207 –> 00:24:25.307
LAWRENCE: I think there’s a lens for everybody on this call is like, you need to prepare yourself for the next chapter of knowledge work.

00:24:25.847 –> 00:24:27.427
LAWRENCE: And that’s what we’re here for.

00:24:27.427 –> 00:24:28.887
LAWRENCE: We want our resumes to be better.

00:24:29.667 –> 00:24:31.187
LAWRENCE: We want to unlock more time.

00:24:31.187 –> 00:24:38.707
LAWRENCE: As Fiona says, we want to re-imagine, what could our role look like if we had 10 to 15 more hours a week to put into strategic work?

00:24:38.707 –> 00:24:41.887
LAWRENCE: Like how could we move the dial for ourselves, for our company?

00:24:41.887 –> 00:24:48.127
LAWRENCE: But then I think there’s a company-wide lens, which is companies compete with each other.

00:24:48.127 –> 00:24:58.987
LAWRENCE: And I think the ones that are quick to embrace the new skills and get hours back, a team of five doing the work of 20 because they’re using these tools, like that’s an advantage.

00:24:58.987 –> 00:25:07.487
LAWRENCE: So I think the chief has to think about both lenses: helping themselves and preparing themselves, but also helping their org get fluent with this new tech.

00:25:08.547 –> 00:25:10.867
MAGGIE: Yeah, that’s great.

00:25:10.867 –> 00:25:11.687
MAGGIE: I totally agree.

00:25:11.687 –> 00:25:13.587
MAGGIE: And I think that’s why everyone’s here on this call.

00:25:13.587 –> 00:25:22.427
MAGGIE: So, whoever mentioned it earlier, I think it was you, Lawrence: everybody on this call is taking the right first step, wherever they are in their journey, to learn more.

00:25:23.287 –> 00:25:25.467
MAGGIE: Figure out ways to adopt and understand the language.

00:25:25.467 –> 00:25:29.927
MAGGIE: So cheers to everybody for signing up for this webinar.

00:25:29.927 –> 00:25:30.287
MAGGIE: All right.

00:25:30.287 –> 00:25:35.167
MAGGIE: So let’s dive into tactically a little bit more.

00:25:35.167 –> 00:25:38.827
MAGGIE: How should assistants and chiefs of staff be using AI now?

00:25:38.827 –> 00:25:42.347
MAGGIE: So let’s talk about the things that we’re doing every day.

00:25:42.347 –> 00:25:49.127
MAGGIE: The ways that AI can help our work, the actual assignments and responsibilities that we’re doing in our job.

00:25:49.127 –> 00:25:53.407
MAGGIE: Shall we start with Chiefs of Staff, Lawrence, and go back to you?

00:25:53.707 –> 00:25:55.007
LAWRENCE: Yeah, sure.

00:25:55.007 –> 00:26:01.387
LAWRENCE: So I think this term rhythm of business is where the opportunity is right now.

00:26:01.387 –> 00:26:04.447
LAWRENCE: Chiefs of Staff have a lot on their plate.

00:26:04.447 –> 00:26:07.447
LAWRENCE: There’s things coming at them from all sides.

00:26:07.447 –> 00:26:11.147
LAWRENCE: So I’ll start with sort of like a high level metaphor of what AI can be for you.

00:26:11.347 –> 00:26:13.107
LAWRENCE: I think it can be a backstop.

00:26:13.107 –> 00:26:20.527
LAWRENCE: It can be an assistant that is helping you capture the notes while you run the meeting.

00:26:21.147 –> 00:26:35.407
LAWRENCE: It can be an assistant that makes sure that after you get done with 10 calls, that there’s a nice tight list of the action items that you have to follow up on so that you’re not dropping any balls, that you’re making sure that you’re keeping the trains running on time.

00:26:35.407 –> 00:26:48.327
LAWRENCE: So I think you can go right down the workflow of the Chief of Staff, around calendar, around meeting notes, around action items, around projects, around creating decks, around writing memos and writing summaries.

00:26:49.227 –> 00:26:54.107
LAWRENCE: All of this stuff is right in the wheelhouse of AI today.

00:26:54.107 –> 00:26:58.867
LAWRENCE: There are startups working on every part of that flow.

00:26:58.867 –> 00:27:06.127
LAWRENCE: So I can share our little market map of the different lanes of the Chief of Staff role, and some companies and products to check out.

00:27:06.127 –> 00:27:08.327
LAWRENCE: I’ll share that after.

00:27:08.327 –> 00:27:13.387
LAWRENCE: But yeah, I think the way to think about it is how can I save time?

00:27:13.387 –> 00:27:17.627
LAWRENCE: How can I take the stuff at the bottom of my list that’s important but repetitive?

00:27:17.627 –> 00:27:20.947
LAWRENCE: And how can I automate that with generative AI?

00:27:20.947 –> 00:27:21.427
MAGGIE: Yeah.

00:27:21.427 –> 00:27:41.767
MAGGIE: And I think another component to everything you said is Chiefs of Staff then driving that alignment on the leadership team, with an AI tool that is doing all the things you mentioned for everyone on the leadership team, and then, from a macro level, summarizing the next steps and the priorities and the follow-ups.

00:27:41.767 –> 00:27:46.847
MAGGIE: The possibilities and the impact just seem to compound and expand on themselves.

00:27:49.027 –> 00:27:52.667
MAGGIE: All right, so let’s dive into EAs specifically.

00:27:52.667 –> 00:27:53.827
MAGGIE: Fiona, let’s start with you.

00:27:53.827 –> 00:28:00.047
MAGGIE: So how should assistants be using AI right now tactically in their job with their current responsibilities?

00:28:01.507 –> 00:28:07.207
FIONA: Yeah, I mean, I’d love to start actually by just suggesting where to start, which is, in my view, dabble.

00:28:07.207 –> 00:28:12.647
FIONA: You know, as the saying goes, you’ve got to F around to find out.

00:28:12.987 –> 00:28:25.707
FIONA: You know, it’s not until you actually start trying out these tools and going through the kind of sometimes messy, chaotic process of testing them and working out which ones are good, where they’re going to work for you, where you’re going to get leverage in your workflow.

00:28:25.707 –> 00:28:30.147
FIONA: And by the way, which ones are completely crap or don’t work for you and don’t meet your needs.

00:28:30.147 –> 00:28:35.487
FIONA: And there’s a lot of crap tools out there at the moment because, you know, everybody is launching them daily.

00:28:35.487 –> 00:28:42.187
FIONA: And, you know, some of them as well are not taking data privacy and infosec very seriously.

00:28:42.187 –> 00:28:45.527
FIONA: And so I think those are all things to be aware of as well.

00:28:45.527 –> 00:28:56.387
FIONA: I’d also suggest when you think about starting, like even if your company has restrictions on how you could use AI at work, there’s nothing stopping you from starting in your personal life.

00:28:56.387 –> 00:29:02.407
FIONA: And actually, I always suggest to folks, probably the best way to start using a tool like ChatGPT is the phone app.

00:29:02.407 –> 00:29:07.287
FIONA: Because it is so intuitive, it has the most incredible voice recognition ever.

00:29:07.287 –> 00:29:12.567
FIONA: And when you use it in your personal life, you don’t have the fear of, oh gosh, am I going to put something in there I shouldn’t?

00:29:13.007 –> 00:29:17.627
FIONA: Am I going to get into hot water with my boss or my organization?

00:29:17.627 –> 00:29:21.727
FIONA: And obviously, discretion is really key as an executive assistant.

00:29:21.727 –> 00:29:32.027
FIONA: So I would say, as a starter, just try using this in your personal life because I think that’s a great way to understand the capabilities and also the limitations.

00:29:32.027 –> 00:29:36.847
FIONA: And then thinking more practically about specific ways you can use these tools.

00:29:37.527 –> 00:29:43.207
FIONA: So I like to think of it as in any sort of productivity tool that you might use at work.

00:29:43.207 –> 00:29:51.487
FIONA: So when you think about the big time sucks, email and calendar, there are AI tools for those.

00:29:52.027 –> 00:29:57.507
FIONA: And in the email space, you’re looking at tools that are auto-composing your email for you.

00:29:57.507 –> 00:30:00.807
FIONA: In some cases, they are drafting replies and leaving them in your drafts folder.

00:30:00.827 –> 00:30:09.527
FIONA: Of course, they are filtering your emails and categorizing them as they come in and nudging you as well and all the good things like you need to reply to this.

00:30:09.527 –> 00:30:19.207
FIONA: For calendar, there’s great functionality out there that shortcuts bits and pieces of calendar management, but we’re absolutely not yet at autonomous calendar management.

00:30:19.207 –> 00:30:25.167
FIONA: I’d say the best tools are allowing you to, for instance, set up templates for different types of meetings.

00:30:25.167 –> 00:30:35.467
FIONA: So for instance, anytime I have a one-to-one, that’s 30 minutes by default, that’s classed as a medium priority, that needs zero buffer time.

00:30:35.467 –> 00:30:41.367
FIONA: Anytime I’m meeting with a client, that’s 60 minutes, that requires a 30-minute buffer time, and that’s always high priority, etc.

00:30:42.087 –> 00:30:47.787
FIONA: Templates like that really help move us towards that autonomous and strategic calendar management.
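The per-meeting-type defaults Fiona describes (duration, priority, buffer) amount to a small configuration table. A minimal sketch in Python, assuming hypothetical template names and fields rather than any real tool’s schema:

```python
# Per-meeting-type defaults, as Fiona describes: one-to-ones are 30 minutes,
# medium priority, no buffer; client meetings are 60 minutes, high priority,
# with a 30-minute buffer. The schema here is an illustrative assumption.

MEETING_TEMPLATES = {
    "one_to_one": {"duration_min": 30, "priority": "medium", "buffer_min": 0},
    "client":     {"duration_min": 60, "priority": "high",   "buffer_min": 30},
}

def apply_template(meeting_type: str, **overrides) -> dict:
    """Start from the defaults for this meeting type, then apply overrides."""
    event = dict(MEETING_TEMPLATES[meeting_type])  # copy so defaults stay intact
    event.update(overrides)
    return event

event = apply_template("client", buffer_min=15)  # client defaults, buffer overridden
print(event)
```

Overrides let one-off exceptions coexist with the defaults, which is roughly what these calendar tools automate.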

00:30:47.787 –> 00:31:04.027
FIONA: Then when you look at the broader suite of tools, I mean, it really runs from deck-building tools, to design tools, to all-in-one tools like Notion and Coda, which are really trying to disrupt documents in your workflow.

00:31:04.027 –> 00:31:07.807
FIONA: Task and project management, one of my favorites is Taskade.

00:31:07.807 –> 00:31:10.207
FIONA: I use it all the time.

00:31:10.207 –> 00:31:20.487
FIONA: Then when you start to think about processes, tools like Scribe and Guidde are able to capture SOPs by just recording your screen as you do something.

00:31:20.727 –> 00:31:31.707
FIONA: If you can imagine how easy and simple it would be to write up an SOP, just clicking through, say, the process for booking a vacation in your new HR tool, for instance.

00:31:31.707 –> 00:31:41.387
FIONA: I also think just diving into ChatGPT specifically, like that tool is incredibly powerful for executive assistants.

00:31:42.587 –> 00:31:47.707
FIONA: I have a guide that includes 130 ways that you can use this in your workflow.

00:31:47.787 –> 00:31:49.707
FIONA: I’m not going to be able to cover all of that today.

00:31:50.187 –> 00:31:59.947
FIONA: I also know afterwards we’ll be sharing around an AI tool guide that I’ve created which has 50 tools I think are most useful for EAs right now, including my unbiased reviews.

00:31:59.947 –> 00:32:02.147
FIONA: Not sponsored by anyone.

00:32:02.147 –> 00:32:08.107
FIONA: I have no relationships with anyone, just my honest opinion on them and what’s good and what’s less good.

00:32:08.107 –> 00:32:13.607
FIONA: But yeah, very happy to dive into specific tools if folks want to use the chat to share what they’re most interested in.

00:32:13.607 –> 00:32:15.847
FIONA: I’m sure we’ll have time in the Q&A as well.

00:32:15.847 –> 00:32:16.847
MAGGIE: Yeah, I love that.

00:32:16.847 –> 00:32:24.087
MAGGIE: So a great reminder, Fiona, we will be sending everybody who’s registered kind of a few takeaways.

00:32:24.087 –> 00:32:27.167
MAGGIE: Fiona just mentioned a huge guide from her.

00:32:27.167 –> 00:32:29.307
MAGGIE: Jessica has a huge guide as well.

00:32:29.307 –> 00:32:38.727
MAGGIE: Lawrence is going to show the very awesome Ambient tool built for Chiefs of Staff, and if we can, we’ll share a meeting summary with you.

00:32:38.727 –> 00:32:46.487
MAGGIE: I don’t know if the webinar allowed our Ambient tool in, but we’ll share a lot of information with you via email after this as well.

00:32:46.987 –> 00:32:49.707
MAGGIE: Thanks, Fiona, for reminding me about that.

00:32:49.707 –> 00:32:57.127
MAGGIE: I want to share just the way that I started dabbling, which is the first thing you said, just dabble into ChatGPT.

00:32:57.127 –> 00:33:01.467
MAGGIE: For those of you that are like, I just haven’t started yet, I don’t know what all those things are.

00:33:01.467 –> 00:33:04.667
MAGGIE: Like Fiona said, download ChatGPT, or Google it.

00:33:04.667 –> 00:33:07.987
MAGGIE: There’s a web version too, also fine.

00:33:07.987 –> 00:33:15.407
MAGGIE: I had a few ingredients for a Thai curry, I think it was, several months ago at the end of last year.

00:33:16.087 –> 00:33:18.187
MAGGIE: And you can make Thai curry a lot of different ways.

00:33:18.187 –> 00:33:21.447
MAGGIE: But I didn’t have all the ingredients for like any recipe that would have popped up.

00:33:21.447 –> 00:33:25.567
MAGGIE: So what I said is, how do I make Thai curry with these ingredients?

00:33:25.567 –> 00:33:27.167
MAGGIE: I don’t have these.

00:33:27.167 –> 00:33:29.407
MAGGIE: And a recipe popped up for me and I followed it.

00:33:29.407 –> 00:33:32.287
MAGGIE: And it was the first time I’d followed a recipe from ChatGPT.

00:33:32.287 –> 00:33:35.047
MAGGIE: So thanks AI for that, for that use case.

00:33:35.047 –> 00:33:36.767
MAGGIE: But there’s so many, right?

00:33:36.767 –> 00:33:40.227
MAGGIE: Like, what do I want to name my next child?

00:33:40.227 –> 00:33:41.667
MAGGIE: I want it to be four letters.

00:33:41.667 –> 00:33:42.927
MAGGIE: Give me 50 options.

00:33:42.927 –> 00:33:43.927
MAGGIE: I want an A in there.

00:33:44.067 –> 00:33:45.127
MAGGIE: I mean, it’s fun.

00:33:45.127 –> 00:33:46.887
MAGGIE: So pop around, play with it.

00:33:46.887 –> 00:33:52.047
MAGGIE: If nothing else, like, leave this call and just practice a little with something fun.

00:33:52.047 –> 00:33:55.067
MAGGIE: So Jessica, over to you.

00:33:55.067 –> 00:33:56.067
MAGGIE: Same question.

00:33:56.067 –> 00:34:02.927
MAGGIE: I want to hear, you know, anything you want to add on to what Fiona said, specifically from an EA perspective.

00:34:02.927 –> 00:34:06.347
MAGGIE: How should assistants be using AI right now?

00:34:07.527 –> 00:34:13.007
JESSICA: Okay, so like Fiona talked about, there’s just a million and four ways that you can actually use it.

00:34:13.227 –> 00:34:18.767
JESSICA: I think I have like 22 different categories in my course, of just different areas.

00:34:18.927 –> 00:34:23.347
JESSICA: For ChatGPT, I like to use it, like I said, as a starting place.

00:34:23.787 –> 00:34:25.907
JESSICA: In comedy, they have something called “no bad ideas.”

00:34:25.907 –> 00:34:28.047
JESSICA: So it’s like to kind of get ideas rolling.

00:34:28.047 –> 00:34:30.067
JESSICA: They’re just like no bad ideas, like say whatever you want.

00:34:30.067 –> 00:34:36.127
JESSICA: And I feel like that’s what it’s like to work with ChatGPT: I don’t have to be embarrassed about, like, my silly little idea.

00:34:36.127 –> 00:34:40.307
JESSICA: I can just kind of talk through it and get all the steps that I need.

00:34:40.307 –> 00:34:44.007
JESSICA: And I think that’s one of the best ways to use it: like I said, as a thought partner.

00:34:44.007 –> 00:34:47.147
JESSICA: It’s how do I start using it in my day-to-day?

00:34:47.147 –> 00:34:54.307
JESSICA: Well, I don’t know how to answer this email, so let me paste it in and have it respond in a professional and polite tone.

00:34:54.307 –> 00:34:57.547
JESSICA: Or, you know, have it write this awkward email on my behalf.

00:34:57.547 –> 00:35:00.607
JESSICA: And that’s like the most basic way to use it.

00:35:00.607 –> 00:35:23.187
JESSICA: But as I’ve grown my business over the past year, I really use ChatGPT as like a business coach, a business partner. If I don’t know how to write a proposal for something, it’s like, hey, here’s my course description, write a proposal for it, and it’ll put it into this nice business format for me, so that I don’t have to worry about masquerading as a grown adult professional.

00:35:23.187 –> 00:35:27.027
JESSICA: I can just, like, have ChatGPT take it over for me.

00:35:27.027 –> 00:35:42.807
JESSICA: And a little bit ago, in the questions, somebody had said something about how you deal with security or data concerns. When I was learning ChatGPT, I was learning from a data analyst who used to work at Facebook.

00:35:42.807 –> 00:35:45.607
JESSICA: And she suggested what she called the Reddit rule.

00:35:45.607 –> 00:35:49.147
JESSICA: And her name is Rachel Woods, if anybody’s ever interested in looking her up.

00:35:49.147 –> 00:35:53.027
JESSICA: And if you wouldn’t put it into... well, this applies to ChatGPT 3.5.

00:35:53.027 –> 00:35:55.747
JESSICA: So, like, this comes with no guarantees of data privacy.

00:35:55.747 –> 00:35:57.167
JESSICA: They’re training on your data.

00:35:57.167 –> 00:36:01.927
JESSICA: So you can say... oh God, I lost my ADHD train of thought.

00:36:01.927 –> 00:36:04.607
JESSICA: Shoot, that’s embarrassing.

00:36:04.607 –> 00:36:06.247
JESSICA: What was I saying?

00:36:07.347 –> 00:36:10.087
FIONA: Oh, you were just talking about Rachel Woods.

00:36:10.307 –> 00:36:10.747
JESSICA: Thank you.

00:36:10.747 –> 00:36:11.527
JESSICA: Sorry, she was.

00:36:13.067 –> 00:36:14.287
MAGGIE: The chat is distracting.

00:36:14.287 –> 00:36:15.487
MAGGIE: Even though keep chatting, we love it.

00:36:15.487 –> 00:36:17.287
MAGGIE: But yes, as panelists have to work through that.

00:36:17.287 –> 00:36:18.467
JESSICA: It’s that ADHD brain.

00:36:18.467 –> 00:36:22.087
JESSICA: It just, they dissolve away like cotton candy.

00:36:22.087 –> 00:36:22.487
JESSICA: Yes.

00:36:22.487 –> 00:36:23.747
JESSICA: So, she called it the Reddit rule.

00:36:23.747 –> 00:36:29.667
JESSICA: So basically, if you wouldn’t put it into an anonymous message board, then don’t put it into ChatGBT.

00:36:29.667 –> 00:36:34.287
JESSICA: You’re not going to go to ChatGPT and be like, my boss’s social security number is X.

00:36:34.447 –> 00:36:36.667
JESSICA: Can you tell me what year he was born?

00:36:36.667 –> 00:36:47.387
JESSICA: You’re going to ChatGPT, and like you said, you’re asking for help on how to do something, and you don’t need to give it details about people’s personal lives or anything like that.

00:36:47.387 –> 00:36:55.007
JESSICA: Almost everything I do in ChatGPT, I take out and put into my own document afterward.

00:36:55.007 –> 00:37:00.507
JESSICA: So really, just know that you don’t log on to ChatGPT and become a bad person.

00:37:00.747 –> 00:37:07.267
JESSICA: You don’t log on to ChatGPT and suddenly forget all the years of data privacy you’ve been practicing for your boss.

00:37:07.487 –> 00:37:08.687
JESSICA: You’re still the same person.

00:37:08.687 –> 00:37:12.627
JESSICA: So you just utilize the tool in a responsible way.
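Part of the Reddit rule can even be automated: scrub obviously sensitive strings before pasting anything into a chatbot. A minimal, illustrative sketch, handling only simple US-style SSN and email patterns; it is not a substitute for judgment:

```python
import re

# Illustrative (not production-grade) pre-paste scrubbing, in the spirit of
# the "Reddit rule": if you wouldn't post it on an anonymous board, redact it.
PATTERNS = {
    "ssn": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
    "email": re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.-]+\b"),
}

def redact(text: str) -> str:
    """Replace each matched pattern with a [REDACTED-<LABEL>] marker."""
    for label, pattern in PATTERNS.items():
        text = pattern.sub(f"[REDACTED-{label.upper()}]", text)
    return text

sample = "My boss (ceo@example.com) has SSN 123-45-6789."
print(redact(sample))
```

Real names, health details, and confidential plans won’t match any regex, so the habit Jessica describes still has to come from the person, not the script.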

00:37:12.627 –> 00:37:14.747
JESSICA: Sorry about the brain, guys.

00:37:14.747 –> 00:37:15.667
MAGGIE: I think it’s relatable.

00:37:15.667 –> 00:37:18.127
MAGGIE: People are loving it, so just don’t even worry.

00:37:18.167 –> 00:37:23.407
LAWRENCE: Can I share some hilarious apps to play with, just in the nature of playing with stuff?

00:37:23.407 –> 00:37:31.767
LAWRENCE: So, Can of Soup, I don’t know if it’s available for everyone yet, but it’s like Instagram, except you describe what you want to see.

00:37:31.827 –> 00:37:35.327
LAWRENCE: So you will see yourself in famous movie scenes, like from Harry Potter.

00:37:35.327 –> 00:37:37.467
LAWRENCE: I mean, it’s hilarious.

00:37:37.467 –> 00:37:44.067
LAWRENCE: The second one is Voicify, where you can generate songs sung by other famous voices.

00:37:44.067 –> 00:37:51.507
LAWRENCE: So if you want to hear Homer Simpson singing Don’t Stop Believin’, you just put in the YouTube link and it’ll crank out Homer singing it.

00:37:51.507 –> 00:37:55.747
LAWRENCE: Then there’s one called aragon.ai, which does AI-powered headshots.

00:37:55.867 –> 00:38:01.147
LAWRENCE: So you’ll get like 200 ridiculous professional headshots of yourself in all moods.

00:38:01.147 –> 00:38:10.147
LAWRENCE: Like my LinkedIn photo is not actually me, it’s generated by a computer, and I have ones for different moods.

00:38:10.147 –> 00:38:15.947
LAWRENCE: Yeah, actually my headshot for this panel is also AI, so a couple to play with.

00:38:15.947 –> 00:38:16.607
MAGGIE: Very cool.

00:38:16.807 –> 00:38:25.127
MAGGIE: We’ll make sure to get those names from Lawrence and include them in our wrap-up summary email, and this chat transcript too, someone was asking about that.

00:38:25.207 –> 00:38:26.387
MAGGIE: So we’ll do our best there.

00:38:26.387 –> 00:38:29.387
MAGGIE: We will come back to kind of privacy and barriers.

00:38:29.387 –> 00:38:33.027
MAGGIE: I think we should have a good conversation around that.

00:38:33.027 –> 00:38:40.447
MAGGIE: But while we’re on the topic of role-specific, EA, Chiefs of Staff, I want to pause for a minute and talk about CEOs.

00:38:40.447 –> 00:38:54.247
MAGGIE: I don’t know if Lawrence, Fiona, Jessica, if any of you have a perspective on this: is there anything different that a CEO should be doing, compared to EAs and Chiefs of Staff, as it relates to the use of AI?

00:38:54.947 –> 00:38:57.227
MAGGIE: Personally and for their company?

00:38:59.147 –> 00:38:59.907
MAGGIE: Lawrence, go for it.

00:38:59.907 –> 00:39:02.127
LAWRENCE: I definitely have opinions on this.

00:39:02.547 –> 00:39:16.647
LAWRENCE: I think the most important thing that a CEO or a founder or leadership can do, is to give their org the green light to educate themselves and to lean into this wave.

00:39:16.747 –> 00:39:23.527
LAWRENCE: I’ve seen lots of examples of CEOs that have just unlocked their org to get smart.

00:39:23.527 –> 00:39:30.227
LAWRENCE: But the CEO doesn’t have to be the top expert on AI in their company, but they have to give the green light that it’s okay to experiment.

00:39:30.227 –> 00:39:32.387
LAWRENCE: I think it’s that kind of leadership that’s needed.

00:39:32.387 –> 00:39:36.367
LAWRENCE: For the folks on the line, I know you have the ear of the CEO.

00:39:36.367 –> 00:39:44.787
LAWRENCE: I think that, by virtue of your adjacency, it is part of your role to convince them of the importance of leaning into this.

00:39:44.787 –> 00:39:45.807
MAGGIE: Yeah, I love that.

00:39:45.807 –> 00:39:49.567
MAGGIE: We’ll include that in our summary, and you guys can forward it on to your leaders if you want.

00:39:51.207 –> 00:39:55.967
MAGGIE: Jessica or Fiona, anything to add around CEO use?

00:39:55.967 –> 00:39:58.047
JESSICA: I think that they need to get ahead of it.

00:39:58.047 –> 00:40:00.647
JESSICA: Why are we having to do this in the shadows?

00:40:00.647 –> 00:40:04.947
JESSICA: It’s the new version of going to the Internet for help.

00:40:06.047 –> 00:40:12.407
JESSICA: When the Internet became a thing in offices, we had to agree to Internet policies, the same thing in school.

00:40:12.407 –> 00:40:17.487
JESSICA: If that’s a concern for your organization, if you’re someone who’s a little nervous about it, get ahead of it.

00:40:18.127 –> 00:40:24.887
JESSICA: Put a policy in place and get everybody on the same page of how we’re going to do this.

00:40:24.887 –> 00:40:27.167
MAGGIE: Yeah, I think that’s great.

00:40:27.167 –> 00:40:30.087
MAGGIE: Fiona, anything from your end there?

00:40:30.087 –> 00:40:36.387
FIONA: I agree with what was just said, and I completely think that the tone comes from the top.

00:40:36.387 –> 00:40:54.947
FIONA: And I think that the risk of not moving on this stuff quickly is much, much greater: being slow means scrambling to figure out what your policy is whilst people are, by the way, already using this stuff and not actually following good data privacy policies or practices.

00:40:54.947 –> 00:40:57.207
FIONA: And so, yeah, I completely agree.

00:40:57.207 –> 00:41:03.347
FIONA: I think it’s just best to get ahead and also to think about, Jessica just mentioned some of the blockers right there.

00:41:03.347 –> 00:41:06.307
FIONA: I think we’ll probably have a separate discussion on all of that.

00:41:06.307 –> 00:41:11.807
FIONA: But you really do need to consider this as a whole change management process.

00:41:11.807 –> 00:41:13.647
FIONA: And there’s a lot of different steps to put in place.

00:41:13.707 –> 00:41:20.027
FIONA: This is not as simple as, we’re going to switch on Copilot within Microsoft and train people up on how to use the tool.

00:41:20.027 –> 00:41:21.607
FIONA: That’s actually the easy part.

00:41:21.607 –> 00:41:23.587
FIONA: It’s a huge mindset shift.

00:41:23.587 –> 00:41:25.047
MAGGIE: Yeah.

00:41:25.047 –> 00:41:26.167
MAGGIE: Thank you all for that.

00:41:26.167 –> 00:41:30.307
MAGGIE: I think it’s super helpful as we all work so closely with our principals.

00:41:30.307 –> 00:41:33.027
MAGGIE: And we are influential leaders with them and thought partners.

00:41:33.027 –> 00:41:39.767
MAGGIE: So I’m sure a lot of these conversations will carry over to the CEO.

00:41:39.767 –> 00:41:45.667
MAGGIE: OK, so we’re going to spend a little time talking about educational resources for the C-suite as it relates to AI.

00:41:45.667 –> 00:41:48.207
MAGGIE: We’re going to talk about what AI cannot replace.

00:41:48.207 –> 00:41:51.207
MAGGIE: And then we’re going to dive into the barriers that we’ve kind of brought up here.

00:41:51.207 –> 00:41:56.507
MAGGIE: Some privacy, some behavioral changes, the people that just don’t want to adopt and adapt.

00:41:57.967 –> 00:41:59.987
MAGGIE: And then we will definitely get to questions.

00:41:59.987 –> 00:42:02.247
MAGGIE: I think we’re good on time here for that.

00:42:02.247 –> 00:42:09.487
MAGGIE: So first, let’s start with educational resources for the C-suite as it relates to AI.

00:42:09.487 –> 00:42:10.307
MAGGIE: Jessica, let’s start with you.

00:42:10.427 –> 00:42:16.227
MAGGIE: Feel free to talk about the amazing products and tools that you each offer yourselves.

00:42:16.227 –> 00:42:20.807
JESSICA: So I think, like Fiona said earlier, the best way to learn is to just get in there and start doing it.

00:42:20.807 –> 00:42:29.727
JESSICA: But I also know that the first time you log on to ChatGPT, it’s a blank, blinking screen, and it’s not always intuitive to know what to do.

00:42:29.727 –> 00:42:35.647
JESSICA: So that’s how I got started: just testing the different executive assistant scenarios.

00:42:35.647 –> 00:42:37.187
JESSICA: So it’s like, what do I need help on?

00:42:37.467 –> 00:42:44.867
JESSICA: Say I’m asked to make a standard operating procedure for using the company credit card, and I just see what it puts out.

00:42:44.867 –> 00:42:52.547
JESSICA: We talk about what an executive assistant or a chief of staff really needs in order to utilize these tools and stand out from them.

00:42:52.547 –> 00:42:55.487
JESSICA: It’s the emotional intelligence that comes along with it.

00:42:55.487 –> 00:43:04.287
JESSICA: I’m really good at using ChatGPT because I have read a lot, and I have a lot of media literacy, and I’ve consumed a lot of information.

00:43:04.287 –> 00:43:10.387
JESSICA: So the more information that I’m able to give it to work with and organize, the better the result is going to be.

00:43:10.387 –> 00:43:18.247
JESSICA: So really dig out your thesaurus and think about different words you can use to describe the way that you want to get a result.

00:43:18.247 –> 00:43:26.307
JESSICA: But it also challenges you in that way of continuous learning because you’re always thinking like, okay, how do I best describe this situation?

00:43:26.307 –> 00:43:28.207
JESSICA: How do I best ask for help?

00:43:28.687 –> 00:43:33.187
JESSICA: So it really helps you to learn to think through things even better.

00:43:34.307 –> 00:43:37.967
JESSICA: I think that’s the difference between a human and a computer, is that I’m continuously learning.

00:43:37.967 –> 00:43:41.907
JESSICA: I’m continuously finding different ways that I can utilize tools.

00:43:41.907 –> 00:43:46.047
JESSICA: And a tool is only a tool, it’s just going to sit there until you actually use it.

00:43:46.047 –> 00:43:51.247
JESSICA: So I don’t ever worry about it replacing our skill set for that aspect.

00:43:51.247 –> 00:43:51.807
MAGGIE: Yeah.

00:43:51.807 –> 00:43:52.647
MAGGIE: Okay.

00:43:52.647 –> 00:43:56.647
MAGGIE: Any educational resources that we haven’t touched on that you want to share?

00:43:56.647 –> 00:43:58.967
JESSICA: Yes, I do have a course, AI for Admins.

00:43:59.587 –> 00:44:03.767
JESSICA: It’s on February 28th from 12 Eastern to 3 Eastern.

00:44:03.767 –> 00:44:09.187
JESSICA: And we cover kind of like baseline, why is it important to understand AI?

00:44:09.187 –> 00:44:12.067
JESSICA: Why is it suddenly so important to talk about it?

00:44:12.067 –> 00:44:31.667
JESSICA: Because it feels like it just kind of came out of nowhere for us last year, as well as a bunch of different tactical ways that you can use ChatGPT, which I like to teach on ChatGPT because once you understand how to use it, you will understand how all of these other programs like Microsoft Copilot can be utilized.

00:44:31.667 –> 00:44:36.607
JESSICA: It’s a very good foundation for understanding the capabilities of generative AI.

00:44:36.607 –> 00:44:43.207
JESSICA: So yes, Lindsay, message me and I will give you a discount because you took it previously and it’s improved.

00:44:43.207 –> 00:44:45.707
JESSICA: So I’m really excited to be doing it again.

00:44:45.707 –> 00:44:48.987
JESSICA: So you can check it out at techsavvyassistant.com.

00:44:48.987 –> 00:44:49.267
MAGGIE: Cool.

00:44:49.267 –> 00:44:51.367
MAGGIE: And we’ll include all the details for that.

00:44:51.367 –> 00:44:52.667
MAGGIE: I think that’s going to be super helpful.

00:44:53.327 –> 00:44:53.707
MAGGIE: All right.

00:44:53.707 –> 00:44:59.707
MAGGIE: So Fiona, do you want to dive into educational resources as it relates to AI?

00:44:59.847 –> 00:45:02.347
MAGGIE: I know that Carve is a huge one.

00:45:03.527 –> 00:45:04.447
FIONA: Yeah, absolutely.

00:45:04.447 –> 00:45:15.287
FIONA: I mean, I would say, I guess, before I get into practical tips, just to take a step back, like, I think we should all be seeing these tools as a co-pilot and as a thought partner.

00:45:15.287 –> 00:45:20.347
FIONA: I hate to use Microsoft’s terms for their product, by the way, but I have to agree it could be one in this case.

00:45:21.267 –> 00:45:25.127
FIONA: But also as it is a tool to fill in for our own deficiencies.

00:45:25.127 –> 00:45:41.547
FIONA: And so I think the more you know yourself and you know the work that you hate doing, like, what is the stuff that you just let hang out on your to-do list for too long, that you have to really force yourself to get through, that feels like a slog, you know, or where you feel like it just takes you twice as long to do than it should.

00:45:41.547 –> 00:45:42.847
FIONA: And start there.

00:45:42.847 –> 00:45:48.227
FIONA: You know, I think that is a starting point for where you want to lean in, where you want to learn, where you want to discover tools.

00:45:48.567 –> 00:45:57.587
FIONA: For me, personally, it’s slide deck, you know, it was a year ago that I went out and said, what is the best AI slide deck creation tool out there that I can use?

00:45:57.587 –> 00:45:59.347
FIONA: Because I hate creating slides.

00:45:59.347 –> 00:46:11.067
FIONA: Like I didn’t have the sort of, you know, two years of working in a strategy, like management consultant role of just building PowerPoint 60 hours a week to teach me how to do it.

00:46:11.067 –> 00:46:14.787
FIONA: And I just hate doing them, you know, and it’s not something I want to master.

00:46:14.787 –> 00:46:19.487
FIONA: So that was the first tool that I went out and said, let me find a solution for this, for me.

00:46:19.487 –> 00:46:26.427
FIONA: I think a good starting point is really that self-awareness, that self-knowledge: thinking about how you could partner with a tool.

00:46:26.427 –> 00:46:29.287
FIONA: What is going to give you the most leverage right now?

00:46:29.287 –> 00:46:38.767
FIONA: And then I think in a more general and practical sense, I would say commit to, you know, subscribing to say one daily AI newsletter.

00:46:38.827 –> 00:46:41.367
FIONA: I can share my favorites.

00:46:41.367 –> 00:46:52.147
FIONA: If you just commit to reading that every day and also commit to, as you’re listening to news, as you’re reading news, however you consume it, to try to find articles about AI and lean into those.

00:46:52.147 –> 00:46:59.587
FIONA: Try to understand the bigger picture issues around regulation, for instance, around copyright and intellectual property.

00:46:59.587 –> 00:47:04.847
FIONA: There’s a whole lot of issues in this space that are still being worked through.

00:47:04.847 –> 00:47:12.007
FIONA: And so I would say, don’t shy away from really leaning into those and consuming whatever media works for you.

00:47:12.007 –> 00:47:16.047
FIONA: I mean, for me personally, I love the coverage in the New York Times.

00:47:16.047 –> 00:47:22.347
FIONA: I love the coverage from the BBC and Radio 4 and their dedicated AI podcast.

00:47:22.347 –> 00:47:30.967
FIONA: I love as well No Priors, the podcast co-hosted by Elad Gil and Sarah Guo from Conviction.

00:47:30.967 –> 00:47:35.187
FIONA: So yeah, happy to share other tips, but practically commit to doing your learning.

00:47:35.267 –> 00:47:55.847
FIONA: And one thing that’s worked certainly for folks on my course is setting aside the time for it, like 15 minutes a week on a Friday, a time where you feel like, okay, this is non-negotiable and even if I have to shift it and move it, really try to stick to that and just be consistent in committing to your own learning, because you’re really committing to yourself.

00:47:56.227 –> 00:48:05.007
FIONA: And you’re right, I do have a course, a signature course called Carve AI, and this is really for executive assistants who want to go on this journey.

00:48:05.007 –> 00:48:24.007
FIONA: It’s a seven-week course, a couple of hours per week, where we go through everything from what is AI, how it can help me, what it means for the EA role moving forward, through to really going deep into ChatGPT and other AI chatbots to understand which is right for you, how to use them, how to get value out of them very quickly.

00:48:24.007 –> 00:48:38.127
FIONA: And then right the way through the AI tools landscape for assistants and through projects and work with peers, basically building your own AI tools stack, which is pretty powerful.

00:48:38.127 –> 00:48:41.987
FIONA: Very small group, program limited to 30 people.

00:48:41.987 –> 00:48:44.767
FIONA: I just launched actually a cohort this week.

00:48:44.767 –> 00:48:47.487
FIONA: And so the next one will be coming in the spring.

00:48:47.487 –> 00:48:53.367
FIONA: And I do also do a whole bunch of corporate workshops, speaking engagements and free events and stuff that you can join up.

00:48:53.367 –> 00:48:55.587
FIONA: Please definitely follow me on LinkedIn.

00:48:56.827 –> 00:48:57.307
MAGGIE: Awesome.

00:48:57.307 –> 00:48:58.307
MAGGIE: Thank you so much, Fiona.

00:48:58.307 –> 00:49:03.147
MAGGIE: Okay, Lawrence, from an educational resource standpoint, anything to add from your end?

00:49:04.267 –> 00:49:07.247
LAWRENCE: Yeah, I like everything that Jessica and Fiona said.

00:49:07.247 –> 00:49:13.527
LAWRENCE: I’m a Twitter guy, so it’s an incredible resource for following the people that are at the front line.

00:49:13.527 –> 00:49:20.247
LAWRENCE: So choose your favorite AI company: OpenAI, Anthropic, Hugging Face, Midjourney.

00:49:20.247 –> 00:49:25.347
LAWRENCE: Follow the executives there, follow the engineers there, and some of them share a lot.

00:49:25.347 –> 00:49:27.327
LAWRENCE: One account I really like is Ethan Mollick.

00:49:27.327 –> 00:49:34.127
LAWRENCE: He’s a professor at Wharton, and he just basically plays around with all these tools and analyzes what happens.

00:49:34.127 –> 00:49:36.527
LAWRENCE: And so I’ve discovered a bunch of new tools through him.

00:49:36.527 –> 00:49:43.907
LAWRENCE: So yeah, Twitter, if you’re a Twitter person, just follow Sarah Guo, who Fiona mentioned, follow Clara Shih from Salesforce.

00:49:44.847 –> 00:49:47.407
LAWRENCE: There’s a whole list of folks you need to be following.

00:49:49.047 –> 00:49:49.827
MAGGIE: Awesome.

00:49:49.827 –> 00:49:50.087
MAGGIE: Okay.

00:49:50.087 –> 00:49:55.467
MAGGIE: Well, we will try to get as many tidbits into the summary email as well from all of these guys.

00:49:55.467 –> 00:50:00.567
MAGGIE: So let’s dive into what can AI not replace?

00:50:00.567 –> 00:50:06.267
MAGGIE: Some of the things are obvious, and I think it’s still relevant to have the conversation because there’s a lot that AI cannot replace.

00:50:07.707 –> 00:50:08.747
MAGGIE: Fiona, let’s start with you.

00:50:08.747 –> 00:50:19.467
MAGGIE: From an assistant perspective, what can an AI not replace that an executive assistant can master and there’s just no replacement for?

00:50:20.647 –> 00:50:22.647
FIONA: Yeah, I think really the human stuff.

00:50:22.647 –> 00:50:48.187
FIONA: So when you think about the human stuff in this role, it’s really sensing, it’s anticipating, it’s nudging, it’s giving feedback, and that kind of upward feedback that executives so rarely get, and when you’re a strategic partner, you’re really able, when you have built up that relationship, of course, of trust, you’re really able to give that feedback in a way that no one else can.

00:50:48.187 –> 00:50:56.987
FIONA: And I think as well, kind of solving complex problems on the fly, being a proactive problem solver is something that I think will be hard for AI to take on.

00:50:56.987 –> 00:51:01.447
FIONA: And then I guess the other big one is really around prioritization.

00:51:01.447 –> 00:51:09.747
FIONA: So when you think about super complex prioritization on a daily and weekly basis, how does this person spend their time?

00:51:09.747 –> 00:51:17.607
FIONA: And I supported an ultra high net worth individual who had more money than God, but literally did not have time.

00:51:17.607 –> 00:51:21.267
FIONA: And so that becomes the most important commodity for these people.

00:51:21.267 –> 00:51:30.567
FIONA: And I think what’s interesting when you’re an EA and also a chief is that you are really the only person in this executive’s life who sees the whole picture.

00:51:31.527 –> 00:51:33.147
FIONA: You see the company’s strategy.

00:51:33.147 –> 00:51:37.187
FIONA: You see the big challenges that are cropping up behind closed doors.

00:51:37.187 –> 00:51:39.007
FIONA: You also see their personal life.

00:51:39.007 –> 00:51:40.947
FIONA: You understand what’s really going on.

00:51:41.067 –> 00:51:45.987
FIONA: And oftentimes these folks are grappling with some really complex stuff outside of work too.

00:51:45.987 –> 00:51:48.667
FIONA: You understand their values really deeply.

00:51:48.667 –> 00:52:02.427
FIONA: And you can also see how they’re spending their time, and spot if there’s a misalignment between those values, what their true priorities are, and how they’re actually spending their time.

00:52:02.427 –> 00:52:27.567
FIONA: So yeah, and I would say all the other human touch stuff as well, like empathy and thoughtfulness, connection, like really simple stuff like, oh, there was a hurricane in Florida, and I know that this employee is from there originally, let me check in on their extended family, or all this client, you know, just had a baby, let’s send a bottle of champagne and bouquet of flowers.

00:52:27.867 –> 00:52:38.167
FIONA: You know, it’s those sorts of thoughtful touches that are actually so important for building long term relationships and trust, and that executive assistants are so good at doing.

00:52:38.167 –> 00:52:45.507
FIONA: I guess the final one, like in a face-to-face world, because we are getting back to that, I think, for most of us, is representing your exec.

00:52:45.507 –> 00:53:10.247
FIONA: So being that person who makes the first impression with VIPs who come through the door and who is able to represent your exec to employees, whether it’s new hires who come in or folks who maybe have never met your executive, you’re able to be that kind of face who can welcome them and represent them, really.

00:53:10.247 –> 00:53:11.707
MAGGIE: Yeah, those are great.

00:53:11.707 –> 00:53:16.747
MAGGIE: Jessica, since we’re on the topic of assistants, do you want to add in here?

00:53:16.747 –> 00:53:18.867
JESSICA: I really feel like Fiona definitely nailed it.

00:53:19.427 –> 00:53:36.587
JESSICA: The one example that I used, because at one point in time, there were some executives on LinkedIn talking about how this new AI tool was going to basically replace assistants, all because it can do something like call and make a reservation for you.

00:53:36.587 –> 00:53:41.367
JESSICA: I just remember thinking that was such a simplistic idea of what an admin does.

00:53:41.367 –> 00:53:48.947
JESSICA: Because anybody can pick up a phone and call, it’s the fact that I know that you get really annoyed when a restaurant is loud.

00:53:48.947 –> 00:53:50.307
JESSICA: You want a quiet restaurant.

00:53:50.307 –> 00:53:56.347
JESSICA: I know that this is a city where you would drive, so I need to make sure that it’s going to have a parking lot at that restaurant.

00:53:56.347 –> 00:54:01.447
JESSICA: I know that the person that you’re meeting with is a vegan, so we need to make sure that there are a couple of vegan options.

00:54:01.447 –> 00:54:03.687
JESSICA: That’s the stuff that makes me who I am.

00:54:03.687 –> 00:54:05.367
JESSICA: It makes me great at my job.

00:54:05.367 –> 00:54:08.187
JESSICA: Not the fact that I can pick up a phone because honestly, I hate doing that.

00:54:08.187 –> 00:54:09.267
JESSICA: That’s the worst part.

00:54:09.267 –> 00:54:20.347
JESSICA: But you know all these details about a person that they don’t even know about themselves, because the people that we support aren’t always the most intuitive of people and they don’t always know themselves that well.

00:54:20.347 –> 00:54:32.247
JESSICA: So being able to pick up on these details and file them away and come back and use them when needed is the absolute difference of why AI won’t ever replace what we do.

00:54:32.247 –> 00:54:34.247
MAGGIE: Yeah, I think that’s great.

00:54:34.247 –> 00:54:39.667
MAGGIE: We will move along unless, Lawrence, you have anything that you want to add that Fiona and Jessica didn’t touch on.

00:54:39.667 –> 00:54:47.527
LAWRENCE: Yeah, just speaking to the Chief of Staff perspective, I gave a presentation yesterday at the Chief of Staff’s Summit here in San Francisco.

00:54:47.527 –> 00:54:53.687
LAWRENCE: It was basically about how my previous Chief of Staff, what made him successful.

00:54:53.687 –> 00:54:56.227
LAWRENCE: And as I went down the list, I had seven things.

00:54:56.227 –> 00:54:58.547
LAWRENCE: I’m not going to go through them all.

00:54:58.547 –> 00:55:04.287
LAWRENCE: I’d say four of them are going to be not fully replaced by AI, but partially replaced.

00:55:04.287 –> 00:55:08.147
LAWRENCE: So I’ll focus on the things that I think are out of reach for AI.

00:55:08.147 –> 00:55:16.047
LAWRENCE: And so number one and number two on the list were, he had my trust and what do I mean by that?

00:55:16.047 –> 00:55:24.547
LAWRENCE: I knew his family, he knew my family, he knew my history, he knew personal things that I was struggling with.

00:55:24.547 –> 00:55:30.267
LAWRENCE: And so there’s something about that, that I think is out of reach for AI, at least for the foreseeable future.

00:55:30.267 –> 00:55:41.067
LAWRENCE: I know folks are working on sort of the mental health assistant, kind of sit-on-your-shoulder kind of thing, and maybe there will be something there, but there’s a human connection that is not going to be replaced anytime soon.

00:55:41.067 –> 00:55:47.087
LAWRENCE: The second big area was dealing with human conflict at my company.

00:55:47.087 –> 00:55:53.047
LAWRENCE: So we were 250 people and my chief of staff, Matt, would come to me and say, you know what?

00:55:53.047 –> 00:55:56.947
LAWRENCE: I think this guy is going to leave unless he gets some help.

00:55:56.947 –> 00:55:59.187
LAWRENCE: And it would have been catastrophic if that person left.

00:55:59.187 –> 00:56:12.647
LAWRENCE: So the guy in my ear, who I knew had the best interest of the company in mind, was absorbing all of these dynamics across the whole company and was able to talk to me about very difficult things that nobody else could talk to me about.

00:56:12.647 –> 00:56:18.147
LAWRENCE: So the true consigliere kind of stuff, that’s not going to be replaced.

00:56:18.147 –> 00:56:22.867
LAWRENCE: And I think critical feedback, too, is I think feedback is a very human thing.

00:56:22.867 –> 00:56:27.427
LAWRENCE: And I think that not too many people are in a position to give the founder feedback.

00:56:27.427 –> 00:56:30.147
LAWRENCE: And I think chiefs can if they have that trust.

00:56:30.147 –> 00:56:32.487
LAWRENCE: So I do think I’m a techno-optimist.

00:56:32.727 –> 00:56:39.927
LAWRENCE: I think AI is going to come in and it’s going to unlock the people on this call to do bigger, greater, more strategic, more human things.

00:56:39.927 –> 00:56:43.707
LAWRENCE: And it’s going to take away the more admin stuff.

00:56:43.707 –> 00:56:47.827
LAWRENCE: And it’s just going to unlock us to be even better.

00:56:47.827 –> 00:56:49.487
LAWRENCE: So yeah.

00:56:49.487 –> 00:56:50.607
MAGGIE: Yeah, I love that.

00:56:50.607 –> 00:56:55.887
MAGGIE: I mean, I’m so glad you shared all those things because as a chief of staff, they ring very, very true.

00:56:55.887 –> 00:56:59.087
MAGGIE: And there’s just absolutely no way that anything could replace that.

00:56:59.967 –> 00:57:02.767
MAGGIE: I also think about the assistant example.

00:57:02.767 –> 00:57:11.867
MAGGIE: Like Jessica was saying, some of the things that have been said are so silly: oh, well, now that we have AI, the assistant won’t be needed to order the thing.

00:57:12.427 –> 00:57:14.227
MAGGIE: So who’s going to do the ordering of the thing?

00:57:14.227 –> 00:57:16.467
MAGGIE: It’s going to just be through AI, you know?

00:57:16.467 –> 00:57:21.367
MAGGIE: But the assistant is the one who’s going to be in the driver’s seat, not the CEO.

00:57:21.367 –> 00:57:24.407
MAGGIE: They’re not going to be adopting a new tool and adding that to their workload.

00:57:24.407 –> 00:57:26.827
MAGGIE: The assistant is just changing the way they’re going to be working.

00:57:27.607 –> 00:57:39.227
MAGGIE: But I think these are such valuable pieces to consider of what AI absolutely can’t do and what we shouldn’t try to even approach AI as solving.

00:57:39.227 –> 00:57:49.307
MAGGIE: So let’s dive into some common barriers that prevent the use or adoption of AI for C-suite users.

00:57:49.307 –> 00:57:52.387
MAGGIE: I know that we’ve got a few things on this list, right?

00:57:52.467 –> 00:58:00.147
MAGGIE: We’ve got the privacy concern, information security, behavioral changes, people not being bought in, general comfort around tech.

00:58:00.147 –> 00:58:04.127
MAGGIE: But I think we can have a bigger conversation and dive into a few of those specifically.

00:58:04.127 –> 00:58:05.407
MAGGIE: So Lawrence, let’s start with you.

00:58:05.407 –> 00:58:10.707
MAGGIE: I know that you are prepared to chat through some of the privacy and information security components.

00:58:10.707 –> 00:58:16.627
MAGGIE: You’ve built an AI tool that you’re having to really weigh all of these things with.

00:58:16.627 –> 00:58:17.107
LAWRENCE: Yeah.

00:58:17.287 –> 00:58:19.727
LAWRENCE: And so this is a big deal.

00:58:20.287 –> 00:58:25.487
LAWRENCE: I think there’s multiple levels of obstacles and some of them are real.

00:58:25.487 –> 00:58:33.947
LAWRENCE: And so I’ll give you a quick example of how Ambient works, just to give you a sense of some of the stress that we’re talking about here.

00:58:33.947 –> 00:58:38.327
LAWRENCE: So with Ambient, you summon an Ambient bot to your meeting.

00:58:38.327 –> 00:58:41.807
LAWRENCE: So right there, that triggers behavioral things like, why are you recording this?

00:58:41.807 –> 00:58:42.327
LAWRENCE: What is this?

00:58:42.327 –> 00:58:43.047
LAWRENCE: I don’t need a bot.

00:58:43.047 –> 00:58:43.947
LAWRENCE: I can take my own notes.

00:58:44.967 –> 00:58:46.227
LAWRENCE: We’ve never had a bot in here before.

00:58:46.227 –> 00:58:47.007
LAWRENCE: Why are we doing it now?

00:58:47.007 –> 00:58:51.867
LAWRENCE: So right off the bat, you’re sort of bumping into some behavioral issues.

00:58:51.867 –> 00:58:55.467
LAWRENCE: The next way the product works is it records the meeting.

00:58:55.467 –> 00:58:59.127
LAWRENCE: After the meeting is over, it generates a transcript.

00:58:59.127 –> 00:59:03.987
LAWRENCE: We wrap that transcript in our special Chief of Staff prompting.

00:59:03.987 –> 00:59:11.687
LAWRENCE: So from the lens of the Chief of Staff, we send multiple jobs with this transcript wrapped in our prompting to GPT-4.

00:59:11.687 –> 00:59:15.687
LAWRENCE: So it’s going to another third party, which is OpenAI.

00:59:15.687 –> 00:59:17.647
LAWRENCE: You can go directly to Microsoft Azure.

00:59:17.707 –> 00:59:19.267
LAWRENCE: I’ll talk about that in a second.

00:59:19.267 –> 00:59:21.987
LAWRENCE: We get back a series of assets.

00:59:21.987 –> 00:59:25.587
LAWRENCE: The agreement we have with OpenAI is they delete that content.

00:59:25.587 –> 00:59:31.027
LAWRENCE: They’re not allowed to use it to train their own models, but that’s a scary thing.

00:59:31.027 –> 00:59:33.007
LAWRENCE: How do we know they’re doing the right thing?

00:59:33.007 –> 00:59:41.407
LAWRENCE: We get back a series of assets and then we deliver you a list of next steps, action items, and a triage framework for managing your work.

00:59:41.747 –> 00:59:42.647
LAWRENCE: That’s the way it works.
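Stripped to its essentials, the pipeline Lawrence describes (take the meeting transcript, wrap it in a Chief of Staff prompt, send it to GPT-4, get structured assets back) can be sketched roughly like this. This is a loose illustration, not Ambient's actual code: the prompt wording, function name, and request shape are assumptions, following the general chat-completion request format.

```python
# Rough sketch of a transcript -> prompt -> model pipeline, as Lawrence describes it.
# The prompt text and function name are illustrative assumptions, not Ambient's code.

CHIEF_OF_STAFF_PROMPT = (
    "You are acting as a Chief of Staff. From the meeting transcript below, extract:\n"
    "1. Next steps\n"
    "2. Action items with owners\n"
    "3. A triage framework: what is urgent vs. what can wait\n\n"
    "Transcript:\n{transcript}"
)

def build_request(transcript: str, model: str = "gpt-4") -> dict:
    """Wrap a raw transcript in the Chief of Staff prompt and shape it as a
    chat-completion request payload (the part sent to the model provider)."""
    return {
        "model": model,
        "messages": [
            {"role": "user",
             "content": CHIEF_OF_STAFF_PROMPT.format(transcript=transcript)},
        ],
    }

# Example: one short transcript becomes one wrapped request.
request = build_request("Alice: ship the deck Friday. Bob: I'll draft it tonight.")
print(request["model"])  # gpt-4
```

The privacy concerns Lawrence raises live in this hand-off: the entire transcript leaves your systems inside that `messages` payload, which is why the data-retention terms with the model provider matter so much.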

00:59:42.647 –> 00:59:48.467
LAWRENCE: So right there, the people that we’re bumping into as obstacles, first of all, IT.

00:59:48.467 –> 00:59:55.787
LAWRENCE: What happens to our confidential transcript from our very important, board-level meeting?

00:59:55.787 –> 00:59:58.027
LAWRENCE: Imagine your executive leadership meetings.

00:59:58.427 –> 01:00:01.887
LAWRENCE: How do we know that content is safe and not going to leak out on the internet?

01:00:01.887 –> 01:00:03.567
LAWRENCE: So that’s the first thing.

01:00:03.567 –> 01:00:07.367
LAWRENCE: The second level of obstacle is coming from legal.

01:00:07.367 –> 01:00:11.807
LAWRENCE: So what happens if my company gets sued 10 years from now?

01:00:11.807 –> 01:00:17.127
LAWRENCE: Could all of this transcript stuff be discovery in a lawsuit against us?

01:00:17.127 –> 01:00:19.767
LAWRENCE: You talk to Stripe and I know people at Stripe.

01:00:19.767 –> 01:00:23.607
LAWRENCE: They will not store any recording of any meeting more than two weeks.

01:00:23.607 –> 01:00:25.167
LAWRENCE: Some industries are tougher than others.

01:00:25.167 –> 01:00:29.107
LAWRENCE: Financial services is very locked down.

01:00:29.107 –> 01:00:35.587
LAWRENCE: And then the third level is just the behavioral stuff I’m talking about, which is like, I don’t know if you all remember Calendly.

01:00:36.027 –> 01:00:41.907
LAWRENCE: Two years ago, if you send a Calendly link to someone, there’s a 50-50 chance they’re going to get really mad.

01:00:41.907 –> 01:00:43.147
LAWRENCE: How dare you?

01:00:43.147 –> 01:00:43.807
LAWRENCE: I don’t work for you.

01:00:43.887 –> 01:00:45.967
LAWRENCE: How dare you send me a Calendly link?

01:00:45.967 –> 01:00:50.887
LAWRENCE: And then over time, most people realize that actually, this is going to save us time with a Calendly link.

01:00:50.887 –> 01:00:54.407
LAWRENCE: So I think there’s a little bit of that behavioral thing that needs to change.

01:00:54.407 –> 01:01:02.467
LAWRENCE: I also like, I’m old enough to remember when Gmail came out and people like, oh my God, Google’s reading all my email, like all that embarrassing personal stuff.

01:01:02.467 –> 01:01:07.307
LAWRENCE: I’m old enough to remember AWS, like I’m not going to store all my customer data at Amazon.

01:01:07.307 –> 01:01:09.267
LAWRENCE: And now most companies do.

01:01:09.267 –> 01:01:12.927
LAWRENCE: So I think some of this stuff is surmountable over time.

01:01:13.487 –> 01:01:15.227
LAWRENCE: I think some of it is real.

01:01:15.227 –> 01:01:22.727
LAWRENCE: And I think the overall dynamic that I’m seeing is that big, powerful, incumbent companies are being very careful.

01:01:22.727 –> 01:01:28.667
LAWRENCE: It’s the challenger companies, the startups that are just saying, you can save me 10 hours a week so I can do more.

01:01:28.667 –> 01:01:29.407
LAWRENCE: Let’s go.

01:01:29.407 –> 01:01:31.107
LAWRENCE: I’m going to roll the dice a little bit.

01:01:31.107 –> 01:01:32.567
LAWRENCE: And that dynamic, I’m an entrepreneur.

01:01:32.567 –> 01:01:33.267
LAWRENCE: So I love that.

01:01:33.267 –> 01:01:39.807
LAWRENCE: I love the hungry little companies taking chances to take down the dinosaur, or the giant, not always dinosaur.

01:01:39.807 –> 01:01:42.407
LAWRENCE: But yeah, I’ll pause there.

01:01:42.407 –> 01:01:43.107
MAGGIE: I love that.

01:01:43.107 –> 01:01:45.907
MAGGIE: I think it’s super helpful to think about it from that perspective.

01:01:45.907 –> 01:01:50.387
MAGGIE: As an AI builder yourself, you have this great insight that none of us do.

01:01:50.387 –> 01:01:52.927
MAGGIE: So there’s other barriers.

01:01:52.927 –> 01:02:04.187
MAGGIE: And a big one is adoption, that person that’s like, I’m just not really into it, and that behavioral change that you might just never see.

01:02:04.387 –> 01:02:09.427
MAGGIE: So Fiona, you’re seasoned in training and development, lifelong leader in learning.

01:02:09.427 –> 01:02:15.447
MAGGIE: How do you work through behavioral change when it comes to adoption of a new technology, specific to AI?

01:02:16.887 –> 01:02:17.847
FIONA: Great question.

01:02:17.847 –> 01:02:24.267
FIONA: I would say just to start with, I love Lawrence’s comments, and by the way, that stuff is really important.

01:02:24.267 –> 01:02:30.867
FIONA: Data privacy, Infosec, I mean, look what happened with social media when governments didn’t regulate that fast enough.

01:02:30.867 –> 01:02:33.427
FIONA: So I think regulation is going to be critical in this space, by the way.

01:02:34.287 –> 01:02:36.907
FIONA: To go into the behavioral side, absolutely.

01:02:36.907 –> 01:02:44.887
FIONA: I think right now, so many people are really underestimating the mindset shift that’s required for this.

01:02:44.887 –> 01:02:51.387
FIONA: And by the way, not the 200 of you who are here right now, you’re leaning into this, right?

01:02:51.387 –> 01:02:58.247
FIONA: But your colleagues, there will be a lot of your colleagues who are super negative, super cynical about this stuff.

01:02:58.247 –> 01:03:04.347
FIONA: And I think we can’t underestimate the amount of effort it’s going to take to move those people along.

01:03:04.347 –> 01:03:11.567
FIONA: And when you think about any sort of change management process, you have to have all your ducks in a row.

01:03:11.567 –> 01:03:14.507
FIONA: I’m not going to talk at length about that here.

01:03:14.507 –> 01:03:20.447
FIONA: But if you think about tactics like creating change agents, and by the way, maybe that will be you.

01:03:20.447 –> 01:03:22.947
FIONA: Maybe you will be a change agent in your organization.

01:03:22.947 –> 01:03:25.087
FIONA: That’s a pretty powerful thing, right?

01:03:25.087 –> 01:03:29.847
FIONA: Creating innovation teams of change agents to spread this stuff within teams.

01:03:30.487 –> 01:03:32.847
FIONA: Getting people on this journey.

01:03:32.847 –> 01:03:35.807
FIONA: I’ve been doing a lot of corporate workshops lately.

01:03:35.807 –> 01:03:47.067
FIONA: Even for some of the most forward thinking VC tech companies out there, there’s always that one or two people in an EA team who are just laggards or who are completely cynical about this.

01:03:47.067 –> 01:03:49.127
FIONA: I do understand their concerns.

01:03:49.127 –> 01:03:57.327
FIONA: But you have to identify who those people are, and you have to start to move them along a continuum, and see this as a continuum.

01:03:58.007 –> 01:04:02.247
FIONA: So some folks, like you out there, you are already ahead of the curve.

01:04:02.247 –> 01:04:03.547
FIONA: That’s amazing.

01:04:03.547 –> 01:04:15.467
FIONA: There’s going to be other folks where nudging them along is just getting them open-minded about their colleagues using this stuff, getting them to think, yeah, AI is not going to take your job in the next six months.

01:04:15.467 –> 01:04:17.047
FIONA: So don’t fear it so much.

01:04:17.047 –> 01:04:20.507
FIONA: Oftentimes, it’s fear that sits behind the cynicism, of course.

01:04:20.647 –> 01:04:33.007
FIONA: So digging a little bit deeper, understanding that, acknowledging that, working through that, using logic as well as emotion, I think is always a good way to connect with folks who are a bit behind the curve.

01:04:33.007 –> 01:04:47.447
FIONA: But yeah, I guess my big one would just be: don’t underestimate the amount of work that needs to be done to get people going on this journey and moving along this continuum, because it’s not about, oh, let’s train them up on this new tool that we brought in.

01:04:47.447 –> 01:04:48.807
FIONA: That’s the easy part.

01:04:48.807 –> 01:04:52.087
FIONA: Anyone can use a tool, anyone can learn that stuff.

01:04:52.367 –> 01:05:00.267
FIONA: It’s really the mindset shift for those 10, 15 percent of the organization who do not want to play ball on this stuff.

01:05:00.267 –> 01:05:00.987
MAGGIE: Yeah.

01:05:01.087 –> 01:05:04.247
LAWRENCE: Fiona, just to add, I think we have to be empathetic.

01:05:04.247 –> 01:05:13.307
LAWRENCE: I think the people that are building these models say that there’s a non-zero chance that this is going to end with the extinction of humanity.

01:05:13.307 –> 01:05:15.007
LAWRENCE: That is scary stuff.

01:05:15.007 –> 01:05:21.227
LAWRENCE: When we pivoted our company to work on AI, we had employees that were like, are we working on the bomb?

01:05:22.147 –> 01:05:24.927
LAWRENCE: How do we ethically get our arms around this?

01:05:24.927 –> 01:05:27.867
LAWRENCE: This is serious stuff.

01:05:27.867 –> 01:05:32.027
LAWRENCE: I landed in the techno-optimist perspective, like, let’s have the good guys working on this stuff.

01:05:34.247 –> 01:05:42.687
LAWRENCE: It’s a much longer discussion, but the term you’ll hear in Silicon Valley is p(doom), the probability of doom coming from this technology.

01:05:43.327 –> 01:05:46.687
LAWRENCE: It’s scary and people are right to be nervous about this stuff.

01:05:47.967 –> 01:05:50.647
MAGGIE: How do you talk to the people who are nervous about this, Lawrence?

01:05:51.767 –> 01:06:01.367
LAWRENCE: Yeah, we pulled together an off-site just on this topic, because it was so intense and personal for our team.

01:06:01.367 –> 01:06:10.747
LAWRENCE: And we just made the case for, again, the techno-optimist view, which is that this is not the first time humans have gone through a big transformational technology wave.

01:06:10.747 –> 01:06:15.167
LAWRENCE: And we’ve always found a way to carve out a space for us higher up in the food chain.

01:06:15.707 –> 01:06:19.787
LAWRENCE: It’s just the way humans have managed to thrive for thousands of years.

01:06:20.387 –> 01:06:28.327
LAWRENCE: So I think that’s where we landed in this concept of like, everyone’s going to be working on this anyway, we might as well bring an ethic to it.

01:06:28.327 –> 01:06:35.007
LAWRENCE: And so we put our stamp on it: this is how ethical people build AI tools.

01:06:35.007 –> 01:06:35.427
MAGGIE: Yeah.

01:06:35.427 –> 01:06:39.467
MAGGIE: And, you know, I appreciate this, this line of conversation.

01:06:39.467 –> 01:06:40.627
MAGGIE: I’m glad you brought it up, Lawrence.

01:06:40.627 –> 01:06:46.667
MAGGIE: So we did have someone pop into the chat and ask, but what about the jobs?

01:06:46.667 –> 01:06:50.507
MAGGIE: Like, how do we not think about those jobs when we’re having this conversation?

01:06:50.507 –> 01:06:54.147
MAGGIE: So I want to make sure, you know, maybe Lawrence, you can continue on here.

01:06:54.147 –> 01:06:55.987
MAGGIE: What’s your, what’s your thought there?

01:06:55.987 –> 01:07:03.227
MAGGIE: How do you talk to someone who is concerned about job loss at a micro or macro level?

01:07:03.227 –> 01:07:10.447
LAWRENCE: Yeah, the studies that are coming out are pretty scary on this, just the percentage of jobs that are at risk.

01:07:10.447 –> 01:07:12.207
LAWRENCE: So that’s the pessimistic side.

01:07:12.207 –> 01:07:20.267
LAWRENCE: The optimistic side is I would say most of the tech leaders that I’m speaking to are not thinking about how can we eliminate jobs.

01:07:20.267 –> 01:07:22.707
LAWRENCE: It’s how can we do 3x the work?

01:07:22.707 –> 01:07:27.347
LAWRENCE: I mean, at least in the United States, it’s a growth mentality, which is grow, grow, grow, grow.

01:07:27.347 –> 01:07:31.207
LAWRENCE: And there’s good and bad with that, I think.

01:07:31.207 –> 01:07:41.847
LAWRENCE: But I think if you take a growth mentality to AI, it becomes like, let’s do the work of a team that’s 3x the size as opposed to let’s do the same work with one third of the people.

01:07:41.847 –> 01:07:52.567
LAWRENCE: So it’s sort of a simplistic answer, but concepts like universal basic income are definitely part of the conversation. I think we don’t even have a business model for this technology yet.

01:07:52.567 –> 01:07:55.567
LAWRENCE: Like look at the New York Times suing OpenAI.

01:07:55.567 –> 01:07:58.107
LAWRENCE: So it’s so early.

01:07:58.107 –> 01:08:01.247
LAWRENCE: There’s so many things that can still go wrong or go right.

01:08:01.247 –> 01:08:05.267
LAWRENCE: So it’s just, again, you all are very early to be on this call.

01:08:07.047 –> 01:08:09.047
MAGGIE: Fascinating for me here as well.

01:08:09.047 –> 01:08:11.747
MAGGIE: So thank you for allowing me to moderate this discussion.

01:08:11.747 –> 01:08:14.387
MAGGIE: Jessica, I want to pass it over to you.

01:08:14.387 –> 01:08:23.347
MAGGIE: What other barriers have you seen that prevent the use of AI, in the conversations you have, in your teaching, and in your experience?

01:08:25.167 –> 01:08:25.607
JESSICA: Yeah.

01:08:25.607 –> 01:08:28.567
JESSICA: So, a lot of fear around the idea that we’re going to lose jobs.

01:08:28.567 –> 01:08:32.887
JESSICA: And I loved that Lawrence brought up the idea of a universal basic income.

01:08:32.887 –> 01:08:36.767
JESSICA: I would love to discuss this if anybody ever wants to talk with me offline.

01:08:36.767 –> 01:08:40.207
JESSICA: I think that we need to imagine a world that isn’t like ours.

01:08:40.207 –> 01:08:41.287
JESSICA: Societies change.

01:08:41.287 –> 01:08:43.227
JESSICA: We’re watching a lot of changes right now.

01:08:44.307 –> 01:08:48.507
JESSICA: We’re going to see a giant shift in the way that humanity operates.

01:08:48.507 –> 01:08:50.007
JESSICA: And it depends on how we look at it.

01:08:50.007 –> 01:08:52.747
JESSICA: I don’t look at it as, you know, oh, we’re going to lose jobs.

01:08:52.747 –> 01:08:55.567
JESSICA: I’m going to say that other roles are going to open up.

01:08:55.567 –> 01:08:57.787
JESSICA: There’s going to be other opportunities.

01:08:57.787 –> 01:09:03.927
JESSICA: You know, this isn’t the first time the administrative industry has had a shift in technology.

01:09:03.927 –> 01:09:11.027
JESSICA: It started with writing things manually, and then it became typewriters, and then computers, and all this additional technology.

01:09:11.167 –> 01:09:15.207
JESSICA: And we became the beacon of technology in our organizations.

01:09:15.207 –> 01:09:21.667
JESSICA: How many of us became the de facto tech person when COVID hit, and we had to roll out remote work to everybody?

01:09:21.667 –> 01:09:24.887
JESSICA: I know that it wasn’t my job at the time to be doing it, but I was.

01:09:24.887 –> 01:09:30.167
JESSICA: So we’re going to just see a shift in the way that the world is working.

01:09:30.167 –> 01:09:34.667
JESSICA: And I always try and come at it from an empathetic point of view, because people are scared.

01:09:34.667 –> 01:09:35.387
JESSICA: And I get it.

01:09:35.387 –> 01:09:37.387
JESSICA: A lot of us live in North America.

01:09:37.387 –> 01:09:39.187
JESSICA: There’s not a lot of safety nets available.

01:09:39.367 –> 01:09:42.487
JESSICA: So the idea of losing your job is terrifying.

01:09:42.487 –> 01:09:51.707
JESSICA: But the best thing that you can do right now is empower yourself with AI, because burying your head in the sand isn’t going to save your job.

01:09:51.707 –> 01:09:54.907
JESSICA: It isn’t going to give you the skills to get a new job.

01:09:54.907 –> 01:09:59.167
JESSICA: It’s only going to cause you problems in your career.

01:09:59.167 –> 01:09:59.927
JESSICA: And I get it.

01:09:59.927 –> 01:10:07.607
JESSICA: I have, you know, some older people in my community, and they’ll say things like, I don’t really want to learn more skills at this point in my career.

01:10:07.607 –> 01:10:08.547
JESSICA: Like, I’m tired.

01:10:08.887 –> 01:10:11.227
JESSICA: I’m like, I am always tired.

01:10:11.227 –> 01:10:11.847
JESSICA: I get it.

01:10:11.847 –> 01:10:19.387
JESSICA: So like, I just need them to get to a place of saying, like, I’m comfortable learning.

01:10:19.387 –> 01:10:25.147
JESSICA: And unfortunately, I don’t believe that the admin field has been super empowered in the learning aspect.

01:10:25.147 –> 01:10:31.967
JESSICA: I don’t think a lot of us have been given L&D budgets so that we can go out and learn.

01:10:31.967 –> 01:10:34.987
JESSICA: So we’ve been kept very small in that aspect.

01:10:34.987 –> 01:10:50.687
JESSICA: And something that I speak to in some of my presentations is how when ChatGPT, for example, first came out, a lot of reports were coming out about how it was going to take a bunch of admin jobs and a bunch of software engineering jobs because it can write code, right?

01:10:50.687 –> 01:11:00.407
JESSICA: But what you’ll see within software engineering and with tech in general is it’s a very collaborative industry and they’re not unfamiliar with sharing tools.

01:11:00.407 –> 01:11:03.407
JESSICA: There’s things like GitHub where they’ve shared code for years.

01:11:03.407 –> 01:11:06.587
JESSICA: But we don’t have that same type of mentality in admin.

01:11:06.647 –> 01:11:11.627
JESSICA: We don’t have this collaborative like, let me share my resources really freely and easily.

01:11:11.627 –> 01:11:16.547
JESSICA: We’re not used to the idea of using a tool to empower our career and leaning on others.

01:11:16.547 –> 01:11:18.627
JESSICA: We’ve all been like, we have to do this ourselves.

01:11:18.627 –> 01:11:22.747
JESSICA: It’s like, I’m the only person I can rely on. We probably all have oldest-daughter syndrome.

01:11:22.747 –> 01:11:32.107
JESSICA: It really needs to be focused on how can I use this to boost my career and use this as my partner versus it’s going to take my job.

01:11:32.107 –> 01:11:36.207
JESSICA: It’s not you versus AI, it’s AI versus corporations.

01:11:37.447 –> 01:11:43.507
MAGGIE: I absolutely appreciate that last part of what you were just saying, Jessica; it very much resonates.

01:11:43.547 –> 01:11:46.987
MAGGIE: I’m really excited that we’re going to mostly leave it there before Q&A.

01:11:46.987 –> 01:11:56.307
MAGGIE: The last thing that I want to ask each of you is one to two sentences about where AI is going in the next five years.

01:11:56.307 –> 01:11:59.567
MAGGIE: I will start so you can think about it briefly.

01:12:00.707 –> 01:12:12.627
MAGGIE: Where I am going with AI is basically continuing testing, learning, growing, and working through some discomfort as a non-AI expert.

01:12:12.627 –> 01:12:14.527
MAGGIE: Let’s move it to Fiona.

01:12:14.527 –> 01:12:20.227
MAGGIE: What’s your one to two sentence recap of where we’re going over the next five years with AI?

01:12:20.227 –> 01:12:53.247
FIONA: I think as we become more comfortable with AI, with the idea of it and with the tools themselves, we will allow it to take over an increasing amount of our role over the next five years. Nobody has a crystal ball, but I think it will take on more and more of the stuff that we never liked to do anyway, and free up our capacity to do more of the really powerful, strategic, value-add work, the stuff that only humans can do and that humans do best.

01:12:54.967 –> 01:12:55.407
MAGGIE: Awesome.

01:12:55.407 –> 01:12:57.067
MAGGIE: Lawrence.

01:12:57.067 –> 01:13:02.567
LAWRENCE: Yeah, this is going to sound a little intense, but here is the world we are building for.

01:13:02.567 –> 01:13:14.187
LAWRENCE: So the context is that these models are improving so fast that what they look like today, and they look pretty good, is nothing like what they will look like in two years.

01:13:14.187 –> 01:13:28.467
LAWRENCE: So we are building for a future where the models will provide God-like accuracy, God-like attention to detail, God-like precision, God-like judgment.

01:13:28.467 –> 01:13:36.467
LAWRENCE: So as you use these tools today, don’t get irritated by, oh, it didn’t quite get that thing right, or it’s not as good as I could have done it.

01:13:36.467 –> 01:13:42.167
LAWRENCE: Squint your eyes and imagine six months from now, with the rate of progression that things are happening.

01:13:42.167 –> 01:13:44.967
LAWRENCE: So I’m going to talk only from a B2B perspective.

01:13:44.967 –> 01:13:50.507
LAWRENCE: We can have a separate panel on the implications on society and it sounds like Jessica wants to join that one too.

01:13:50.507 –> 01:14:02.627
LAWRENCE: But from a B2B perspective, from a business perspective, I think we are heading towards a world with zero mistakes, zero dropped balls, flawless execution.

01:14:02.627 –> 01:14:14.607
LAWRENCE: Then it’ll come down to things like vision, market, and luck. But failed execution and dropped balls, I think, will be a thing of the past.

01:14:14.607 –> 01:14:15.047
MAGGIE: Okay.

01:14:15.047 –> 01:14:15.687
MAGGIE: Thank you, Lawrence.

01:14:15.947 –> 01:14:18.127
MAGGIE: Jessica, we’ll end with you.

01:14:18.127 –> 01:14:21.127
JESSICA: I think we’re going to see a really dynamic shift in the admin field.

01:14:21.127 –> 01:14:33.047
JESSICA: I think this is going to be the opportunity to branch out from being just support to strategic on a regular basis, and it’s really going to empower a lot of people to take their careers places where they didn’t think they could go.

01:14:33.047 –> 01:14:36.827
JESSICA: Maybe they’re feeling burned out in their current situation, or they’re bored.

01:14:36.827 –> 01:14:39.347
JESSICA: And you can use ChatGPT or you can use AI.

01:14:39.347 –> 01:14:40.787
JESSICA: Sorry, I always say ChatGPT.

01:14:40.787 –> 01:14:48.267
JESSICA: You can use AI to be your thought partner and really expand what you’re doing and change your career path.

01:14:48.267 –> 01:14:48.847
MAGGIE: Fabulous.

01:14:48.847 –> 01:14:49.247
MAGGIE: Okay.

01:14:49.247 –> 01:14:58.247
MAGGIE: Well, let’s see if you have questions. The easiest way to do this on a webinar is to pop them into the Q&A section.

01:14:58.247 –> 01:15:04.547
MAGGIE: I’m not exactly sure what that looks like on your end, but there’s a Q&A area, probably down at the bottom.

01:15:04.547 –> 01:15:06.287
MAGGIE: A little easier than going back through the chat.

01:15:06.287 –> 01:15:09.287
MAGGIE: So I’m going to read through some of these questions.

01:15:09.287 –> 01:15:11.727
MAGGIE: And we’re going to do it a little bit more.

01:15:11.727 –> 01:15:16.387
MAGGIE: I don’t want to say rapid style, but a little bit more question, answer, question, answer.

01:15:16.387 –> 01:15:22.507
MAGGIE: And I will call out the panelists who I would like to answer for each of these questions.

01:15:23.587 –> 01:15:35.367
MAGGIE: Jessica, do you think AI will put a strain on junior EAs or like admin associates to develop a strategic lens in their role a lot quicker than they may have before?

01:15:35.367 –> 01:15:36.627
JESSICA: I don’t think it’s a strain.

01:15:36.627 –> 01:15:45.287
JESSICA: I think that’s a major positive. You know, the idea of an entry-level job has really gone by the wayside.

01:15:45.287 –> 01:15:51.387
JESSICA: It’s like by the time you’re starting an admin role, they kind of expect you to jump in like full speed.

01:15:51.387 –> 01:15:53.647
JESSICA: I don’t think you get a lot of like time to gear up.

01:15:53.647 –> 01:15:58.727
JESSICA: So using AI is just going to help you scale up faster.

01:15:58.727 –> 01:16:06.047
JESSICA: You’re not having to send an IM and wait for somebody to respond to some question.

01:16:06.047 –> 01:16:08.447
JESSICA: You’ll be able to get an answer a lot faster.

01:16:08.587 –> 01:16:11.787
JESSICA: It’s like, how much time did I spend at my jobs in the past?

01:16:11.787 –> 01:16:14.107
JESSICA: Just like dawdling, waiting for a response.

01:16:14.107 –> 01:16:18.287
JESSICA: It’s, you know, you can get instant advice or instant recommendations.

01:16:18.287 –> 01:16:26.727
JESSICA: You don’t have to deluge your superior with questions because you can get a general idea and then you can go and work through it elsewhere.

01:16:26.727 –> 01:16:29.787
JESSICA: So it’s going to change the way that you’re working.

01:16:29.787 –> 01:16:32.547
JESSICA: But I don’t see that that’s a negative at all.

01:16:32.547 –> 01:16:34.387
MAGGIE: Yeah, I appreciate that.

01:16:34.387 –> 01:16:38.027
MAGGIE: Lawrence, we were just on the topic around dangers and extinction from AI.

01:16:38.447 –> 01:16:41.707
MAGGIE: What are the obligations of our leaders and our company leaders?

01:16:41.707 –> 01:16:47.747
MAGGIE: Do you see companies taking a stance for or against AI adoption, or is it a little bit more passive than that?

01:16:48.867 –> 01:16:53.247
LAWRENCE: Yeah, and what’s missing there is government, the responsibility of government.

01:16:53.247 –> 01:16:59.267
LAWRENCE: And so you’re seeing Europe put in a lot more guardrails as they do.

01:16:59.267 –> 01:17:04.707
LAWRENCE: And we can debate like what’s the right balance between growth and kind of guardrails.

01:17:06.407 –> 01:17:20.987
LAWRENCE: Just in terms of the lay of the land from the model perspective, the company Anthropic, which is one of the rivals to OpenAI, has famously taken a safety-first perspective.

01:17:20.987 –> 01:17:28.167
LAWRENCE: The founders of Anthropic left OpenAI over a debate about how fast they should allow this tech to move.

01:17:28.167 –> 01:17:33.047
LAWRENCE: And so they’re trying to build for a future of ethical AI, a little more conservative.

01:17:34.207 –> 01:17:38.707
LAWRENCE: That debate between the doomers and the growth camp is not over.

01:17:38.707 –> 01:17:43.667
LAWRENCE: Like if you followed at all what happened at OpenAI, I was working right across the street with a bunch of startups.

01:17:43.667 –> 01:17:46.987
LAWRENCE: And we saw that one day none of the cars came into OpenAI’s parking lot.

01:17:46.987 –> 01:17:47.367
LAWRENCE: Why?

01:17:47.367 –> 01:17:59.327
LAWRENCE: Because there was a coup going on, and there was a battle at the board level between the conservatives, not politically conservative, I mean the ones that wanted to go slower, and the ones that wanted to go faster.

01:17:59.367 –> 01:18:04.267
LAWRENCE: And initially, the ones that wanted to go slower won, and then the people that wanted to go faster won.

01:18:04.267 –> 01:18:06.467
LAWRENCE: So it’s all playing out right now.

01:18:06.467 –> 01:18:11.987
LAWRENCE: So yes, I believe that folks building these tools need to be ethical about the way they’re doing that.

01:18:11.987 –> 01:18:16.147
LAWRENCE: And I think there’s a lot of work to be done to figure out what does that even mean?

01:18:16.147 –> 01:18:18.667
LAWRENCE: I think governments have to play a role.

01:18:18.667 –> 01:18:21.347
LAWRENCE: And I think we all want growth to some extent.

01:18:21.347 –> 01:18:22.307
LAWRENCE: We all want jobs.

01:18:22.667 –> 01:18:25.627
LAWRENCE: We all want GDP, I think.

01:18:25.627 –> 01:18:27.407
LAWRENCE: So yeah, it’s above my pay grade.

01:18:27.907 –> 01:18:32.527
LAWRENCE: The question is a big one, but I think everybody needs to kind of take this stuff seriously.

01:18:32.527 –> 01:18:32.947
MAGGIE: Okay.

01:18:32.947 –> 01:18:34.467
MAGGIE: Thank you, Lawrence.

01:18:34.467 –> 01:18:37.007
MAGGIE: This question is for Fiona and Jessica.

01:18:37.007 –> 01:18:39.487
MAGGIE: And I’m interpreting it a little bit differently for the audience.

01:18:39.487 –> 01:18:48.307
MAGGIE: But what would be your two top tips for prompting, specifically in ChatGPT?

01:18:48.307 –> 01:18:49.587
MAGGIE: Fiona first.

01:18:50.887 –> 01:18:54.087
FIONA: So yeah, I think I kind of covered this actually earlier.

01:18:54.087 –> 01:19:02.187
FIONA: Just to recap, I think the context, giving it as much context and detail as possible, would be my number one top tip.

01:19:02.187 –> 01:19:05.627
FIONA: And actually, I have a whole framework for prompting called QOP.

01:19:05.627 –> 01:19:07.247
FIONA: And context is the first piece of that.

01:19:07.247 –> 01:19:11.007
FIONA: So I can share that with you later if you want to message me on LinkedIn.

01:19:11.007 –> 01:19:13.807
FIONA: And the second tip I would say is iterate.

01:19:13.807 –> 01:19:20.287
FIONA: So really, you know, embrace this back and forth chat dialogue with the tool.

01:19:20.287 –> 01:19:22.687
FIONA: And this is known as chain of thought prompting.

01:19:22.687 –> 01:19:25.247
FIONA: So you may have seen it written as such around the Internet.

01:19:25.827 –> 01:19:31.887
FIONA: And, you know, just don’t settle for the first response it gives you as the final one.
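Fiona’s two tips, packing the prompt with context up front and then iterating rather than accepting the first reply, can be sketched in code. This is a minimal illustration using the common chat-message format; the helper names and the example task are illustrative assumptions, not anything shown on the panel.

```python
# A minimal sketch of the two prompting tips from this discussion.
# The helper names and example task below are illustrative assumptions.

def build_prompt(task, context, audience, output_format):
    """Tip 1: give the model as much context and detail as possible up front."""
    return (
        f"Task: {task}\n"
        f"Context: {context}\n"
        f"Audience: {audience}\n"
        f"Desired format: {output_format}"
    )

def iterate(messages, feedback):
    """Tip 2: don't settle for the first response; feed corrections back
    as a follow-up turn in the same conversation."""
    return messages + [{"role": "user", "content": feedback}]

# First turn: a context-rich prompt instead of a bare one-liner.
messages = [{"role": "user", "content": build_prompt(
    task="Draft a board-meeting agenda",
    context="Quarterly review for a 40-person startup; CEO wants focus on hiring",
    audience="Executive team",
    output_format="Bulleted agenda, under one page",
)}]

# After reading the model's first reply, push it further instead of accepting it.
messages = iterate(messages, "Good start. Tighten item 3 and add a 10-minute Q&A slot.")
```

The message list would be sent to whichever chat tool or API you use; the point is the shape of the conversation, not any particular vendor.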

01:19:31.887 –> 01:19:32.367
MAGGIE: I love that.

01:19:32.367 –> 01:19:35.067
MAGGIE: Jessica, you’re welcome to answer that same question.

01:19:35.067 –> 01:19:39.387
MAGGIE: Or maybe your two favorite prompts if you have some fun ones.

01:19:39.387 –> 01:19:41.367
JESSICA: No, I mean, because everything is so situational, right?

01:19:41.367 –> 01:19:44.767
JESSICA: But yeah, I would say, like Fiona said, just making sure that you’re being very descriptive.

01:19:44.767 –> 01:19:54.147
JESSICA: I found early on that because I am neurodivergent and I have a habit of over-explaining things as a baseline, that’s why I would get really good responses from ChatGPT.

01:19:54.147 –> 01:19:56.927
JESSICA: So making sure, like I said, that you’re giving it a lot of detail.

01:19:56.927 –> 01:19:59.747
JESSICA: And then also telling it, are you doing a good job?

01:19:59.747 –> 01:20:00.747
JESSICA: Are you doing a bad job?

01:20:00.747 –> 01:20:08.867
JESSICA: Telling it that is really good when you’re iterating on it: that was good, continue doing that, or, I didn’t like this, change this.

01:20:08.867 –> 01:20:12.827
JESSICA: That way, you can continue to tweak the way that it will respond to you.

01:20:14.267 –> 01:20:14.667
MAGGIE: Awesome.

01:20:14.667 –> 01:20:16.367
MAGGIE: So Jessica, this one’s back to you.

01:20:16.367 –> 01:20:20.987
MAGGIE: A question I get stuck on when I’m at work is, was that you or was that ChatGPT?

01:20:21.627 –> 01:20:29.407
MAGGIE: How do you answer that in a way that recognizes your skills and creating great results with AI and doesn’t diminish the person’s ability?

01:20:29.407 –> 01:20:34.587
JESSICA: So I think there’s another question, too, in there about like, people say that it’s cheating to use ChatGPT.

01:20:34.587 –> 01:20:47.607
JESSICA: And so I’ll address these together, because I get them in my presentations a lot. When you’re using ChatGPT as a thought partner, you’re taking the knowledge that you have and refining it in a way that’s presentable to the professional world.

01:20:47.607 –> 01:20:49.327
JESSICA: That’s the way that I’m using ChatGPT.

01:20:49.927 –> 01:20:57.547
JESSICA: I don’t have to worry about being questioned about the way that I use ChatGPT because I can still speak to the knowledge that I have.

01:20:57.547 –> 01:21:03.327
JESSICA: And, let me just say this: as a woman, it’s never been my first natural instinct to speak with my knowledge.

01:21:03.327 –> 01:21:07.167
JESSICA: Like it’s never my first instinct to give my opinion.

01:21:07.167 –> 01:21:13.927
JESSICA: By the time that I’m at a place where I’m ready to give my opinion professionally, I’m very confident in the words that I’m saying.

01:21:13.927 –> 01:21:17.527
JESSICA: So if you have the knowledge, it’s just refining it.

01:21:17.627 –> 01:21:18.847
JESSICA: It’s your assistant.

01:21:19.667 –> 01:21:24.447
JESSICA: It’s like you’re the CEO and you gave something to your assistant and said, hey, can you refine this for me?

01:21:24.687 –> 01:21:27.667
JESSICA: It’s still your general thoughts and ideas.

01:21:27.667 –> 01:21:31.547
JESSICA: So there’s not really that concern in that respect.

01:21:31.547 –> 01:21:37.587
JESSICA: Like, if you’re going to ChatGPT and just saying, write an article on something, totally generic, yeah, that would be embarrassing.

01:21:37.587 –> 01:21:42.047
JESSICA: But using your own knowledge is really what the foundation of the difference is.

01:21:42.667 –> 01:21:50.607
JESSICA: When I was in high school and my early university days, Wikipedia was just becoming a thing.

01:21:50.607 –> 01:21:52.947
JESSICA: And they would tell us you can’t use Wikipedia as a source.

01:21:52.947 –> 01:21:54.407
JESSICA: It’s not reliable.

01:21:54.407 –> 01:21:58.047
JESSICA: And I’m sure they pulled the same thing with the Internet before that.

01:21:58.047 –> 01:22:05.567
JESSICA: You can’t trust the Internet. We’ve gone through these variations in history whenever there’s a new tool.

01:22:05.567 –> 01:22:07.327
JESSICA: So people aren’t sure they want to trust it yet.

01:22:07.327 –> 01:22:09.647
JESSICA: They haven’t built up that like, oh, it’s Google.

01:22:09.647 –> 01:22:10.647
JESSICA: Of course I can trust it.

01:22:11.147 –> 01:22:18.547
JESSICA: So it’s going to take some time for people to get comfortable with the idea of using ChatGPT and adopt it.

01:22:18.547 –> 01:22:22.587
JESSICA: But people have been using copywriters for ages.

01:22:22.587 –> 01:22:26.087
JESSICA: People have been having people build their presentations for them.

01:22:26.087 –> 01:22:30.407
JESSICA: So you still speak to the knowledge of the presentation at the end of the day.

01:22:30.407 –> 01:22:33.447
JESSICA: So it really is like, is this what I know?

01:22:34.467 –> 01:22:36.487
MAGGIE: Yeah, I think that’s great.

01:22:36.487 –> 01:22:45.647
MAGGIE: There’s a question here around, there’s a few actually, around utilizing AI to help make the move from EA to Chief of Staff.

01:22:45.647 –> 01:22:48.727
MAGGIE: And I would love to hear answers on this.

01:22:48.727 –> 01:22:49.667
MAGGIE: Lawrence, do you want to start with you?

01:22:49.667 –> 01:22:53.947
MAGGIE: I know you’ve interviewed hundreds of Chiefs of Staff to build Ambient.

01:22:53.947 –> 01:22:58.107
LAWRENCE: Yeah, so I have.

01:22:59.107 –> 01:23:04.707
LAWRENCE: I think the line can be murky between a strategic trusted EA and a Chief of Staff.

01:23:04.707 –> 01:23:06.387
LAWRENCE: I think there’s a lot of gray area there.

01:23:06.387 –> 01:23:11.287
LAWRENCE: So I think it’s not really about the tools that you’re using that makes you a Chief of Staff.

01:23:11.287 –> 01:23:19.607
LAWRENCE: I think it’s about the trust and the access and the projects that you are working on.

01:23:19.607 –> 01:23:26.987
LAWRENCE: And so I think my advice would be for any EA is take the most time-consuming part of your job.

01:23:26.987 –> 01:23:37.307
LAWRENCE: If you’re taking minutes at meetings right now, see if you can bring that time down by 90% by using Ambient, by using an AI note taker.

01:23:37.887 –> 01:23:43.847
LAWRENCE: If you are loading action items and next steps into a project management system, man, I have a tool for you.

01:23:43.847 –> 01:23:45.827
LAWRENCE: You’ve got to get on Ambient tomorrow.

01:23:45.827 –> 01:23:57.027
LAWRENCE: And I will share a market map as part of the follow-up materials that goes through every single workflow, the calendaring, the drafting of decks, right down the list of tools that you can try.

01:23:57.027 –> 01:24:12.207
LAWRENCE: So I think the unlock for EAs to get to more strategic work, and you can call the title what you want, is to find ways to automate the manual stuff and free yourself up to work on the more strategic projects.

01:24:12.207 –> 01:24:14.067
LAWRENCE: So I’m not even going to go into the title distinction.

01:24:14.447 –> 01:24:15.487
MAGGIE: I’ll stick with it.

01:24:15.487 –> 01:24:15.887
MAGGIE: Yeah.

01:24:15.887 –> 01:24:18.087
MAGGIE: I appreciate that.

01:24:18.367 –> 01:24:19.727
MAGGIE: So let’s see how this will work.

01:24:19.727 –> 01:24:22.107
MAGGIE: Yes or no from each of you.

01:24:22.107 –> 01:24:28.547
MAGGIE: If you produce something from ChatGPT, should you tell your executive that it was birthed from AI or keep that to yourself?

01:24:28.727 –> 01:24:29.987
MAGGIE: Fiona, yes or no?

01:24:30.527 –> 01:24:33.027
MAGGIE: Tell your exec that you used AI or not?

01:24:34.127 –> 01:24:36.567
FIONA: I think it depends on the context.

01:24:36.567 –> 01:24:56.687
FIONA: Just to add on to what was already said, and I think Jessica had a great response to this by the way: if your starting point with these tools is that they’re a co-pilot, they’re a thought partner, they’re filling in for the deficiencies that you have, like I said earlier, then of course, why wouldn’t you use these tools?

01:24:56.767 –> 01:25:06.887
FIONA: It would be silly not to; you’re leaving money on the table by not using these tools to help you polish and refine and fill in your own blind spots on a piece of work.

01:25:06.887 –> 01:25:12.067
FIONA: So in certain cases, it might be a good idea to reveal that you have used it.

01:25:12.067 –> 01:25:20.507
FIONA: One of my connections who supports the venture capitalists here in London, new to the job, her boss said, could you actually help me rewrite my bio?

01:25:20.507 –> 01:25:24.627
FIONA: So she did it with ChatGPT and it had a couple of things in it that were factually incorrect.

01:25:25.207 –> 01:25:30.367
FIONA: Luckily, they had a laugh about it and it wasn’t a career killer or anything like that for her.

01:25:30.367 –> 01:25:40.867
FIONA: But I think the point I’m making is if it’s something fact-based like that, you really do need to be very careful about using a tool that is prone to hallucination like ChatGPT.

01:25:40.867 –> 01:25:46.107
FIONA: I would definitely disclose it in that case because of the level of risk there.

01:25:46.107 –> 01:25:48.667
MAGGIE: Any disagreement at all on the panel with what Fiona said?

01:25:48.667 –> 01:25:51.707
MAGGIE: Or do you need the context before you can answer the question?

01:25:51.707 –> 01:25:52.027
MAGGIE: Yeah.

01:25:52.327 –> 01:25:53.887
LAWRENCE: Context. You need to know your principal.

01:25:53.887 –> 01:25:55.507
LAWRENCE: You need to know your CEO.

01:25:56.627 –> 01:25:57.707
LAWRENCE: Yeah.

01:25:57.707 –> 01:25:58.427
MAGGIE: Yeah.

01:25:58.427 –> 01:26:00.087
MAGGIE: Yeah, that’s a great point.

01:26:00.087 –> 01:26:05.207
MAGGIE: Jessica, how do you showcase AI proficiency on a resume?

01:26:05.207 –> 01:26:06.207
MAGGIE: What do you say?

01:26:06.207 –> 01:26:07.107
MAGGIE: What do you list?

01:26:07.107 –> 01:26:13.567
MAGGIE: How do you talk about how you can differentiate yourself from others based on your AI use?

01:26:13.567 –> 01:26:14.247
JESSICA: I’m going to be honest.

01:26:14.247 –> 01:26:16.007
JESSICA: I’m the last person you should ask about this.

01:26:16.787 –> 01:26:24.347
JESSICA: I wasn’t really job hunting for very long after ChatGPT became available.

01:26:25.547 –> 01:26:26.287
JESSICA: I’m sorry.

01:26:26.287 –> 01:26:28.767
JESSICA: I could not give you an answer on this one.

01:26:28.767 –> 01:26:33.087
MAGGIE: Fiona or Lawrence, do you have anything to answer here?

01:26:35.047 –> 01:26:38.207
LAWRENCE: I mean, from a founder’s perspective, LinkedIn, not resume.

01:26:38.207 –> 01:26:40.907
LAWRENCE: Sorry.

01:26:40.907 –> 01:26:41.367
LAWRENCE: Go ahead, Fiona.

01:26:41.367 –> 01:26:41.747
LAWRENCE: I’m sorry.

01:26:41.747 –> 01:26:43.327
FIONA: Yeah.

01:26:43.367 –> 01:26:44.347
FIONA: I think we’re talking over each other.

01:26:44.347 –> 01:26:45.207
MAGGIE: I’ll let you go first.

01:26:46.587 –> 01:26:48.707
MAGGIE: Lawrence, you go ahead and finish what you’re saying for sure.

01:26:48.707 –> 01:26:49.367
FIONA: Yeah.

01:26:49.647 –> 01:26:51.367
LAWRENCE: I think it’s LinkedIn, not resume.

01:26:51.367 –> 01:26:53.147
LAWRENCE: I think you are what you do.

01:26:53.147 –> 01:27:06.247
LAWRENCE: If there’s activity on your LinkedIn that shows your interest in or your fluency around this domain, I think that’s the obvious way to stand out and to associate yourself with this skill set.

01:27:06.247 –> 01:27:07.327
MAGGIE: That’s a great point.

01:27:07.327 –> 01:27:07.587
MAGGIE: Yeah.

01:27:07.587 –> 01:27:10.267
MAGGIE: Fiona.

01:27:10.267 –> 01:27:10.487
FIONA: Yeah.

01:27:10.487 –> 01:27:21.707
FIONA: I was just going to say that any courses and things that you do, you can obviously put those on your CV or on your LinkedIn; that’s always a great way to demonstrate it.

01:27:21.707 –> 01:27:25.007
FIONA: I think it’s also a great talking point for interviews as well.

01:27:25.007 –> 01:27:28.847
FIONA: Bring it up and ask, how are you thinking about AI right now?

01:27:28.847 –> 01:27:30.707
FIONA: What is your strategy around AI?

01:27:30.707 –> 01:27:34.827
FIONA: Am I going to be able to use these fantastic tools out there in this role?

01:27:34.867 –> 01:27:39.547
FIONA: I think that will show that you are someone who’s forward thinking in this space.

01:27:39.547 –> 01:27:40.107
MAGGIE: Yeah.

01:27:40.107 –> 01:27:51.547
MAGGIE: I would probably add that if it is a role where you want to highlight your AI ability, there’s nothing wrong with adding a bullet or two around, I am well-versed in a number of different AI tools.

01:27:53.247 –> 01:27:56.507
MAGGIE: I’ve used X, Y, and Z in the past to create these things.

01:27:56.507 –> 01:28:06.087
MAGGIE: To show your expertise is a good thing, and thinking about how you do that and which words you use to demonstrate your expertise is fine and helpful for sure.

01:28:07.147 –> 01:28:08.207
MAGGIE: Okay.

01:28:08.207 –> 01:28:08.827
MAGGIE: Let’s see.

01:28:09.167 –> 01:28:12.007
MAGGIE: One more question here.

01:28:12.007 –> 01:28:12.887
MAGGIE: This is a fun one.

01:28:12.887 –> 01:28:15.687
MAGGIE: We’ll go around again from each of you.

01:28:15.687 –> 01:28:21.747
MAGGIE: Fiona, what are the top three AI apps that you would recommend for C-suite users?

01:28:22.907 –> 01:28:25.567
FIONA: Oh, that’s a tough question.

01:28:25.567 –> 01:28:30.487
FIONA: I would have to say, okay, number one is ChatGPT.

01:28:30.487 –> 01:28:34.527
FIONA: Number two is Taskade, which I just mentioned.

01:28:34.527 –> 01:28:36.127
FIONA: Absolutely adore it.

01:28:36.127 –> 01:28:39.287
FIONA: I would say number three is VimCal for calendaring.

01:28:40.467 –> 01:28:41.107
MAGGIE: Okay.

01:28:41.107 –> 01:28:42.927
MAGGIE: Jessica, you’re up.

01:28:42.927 –> 01:28:46.007
JESSICA: So, ChatGPT obviously number one.

01:28:46.007 –> 01:28:48.507
JESSICA: I like EA Buddy for scheduling.

01:28:48.507 –> 01:28:53.007
JESSICA: They have a ChatGPT bot in there, and they have some other really neat features.

01:28:53.007 –> 01:28:58.727
JESSICA: And then Magic To Do, that’s a really great one for breaking things down into steps.

01:28:58.727 –> 01:29:03.047
JESSICA: I find that I tend to overestimate how much time something will take.

01:29:03.047 –> 01:29:05.587
JESSICA: So, I can use Magic To Do to break it down.

01:29:05.587 –> 01:29:07.707
JESSICA: And they also have a time estimator.

01:29:07.707 –> 01:29:10.747
JESSICA: So, it’s a really cool tool that’s free as well.

01:29:10.747 –> 01:29:11.087
MAGGIE: Awesome.

01:29:11.087 –> 01:29:11.967
MAGGIE: Thank you, Jessica.

01:29:11.967 –> 01:29:14.687
MAGGIE: And Lawrence, we’ll end with you.

01:29:14.687 –> 01:29:15.487
LAWRENCE: Yeah.

01:29:15.487 –> 01:29:17.707
LAWRENCE: Agree on VimCal for calendaring.

01:29:18.727 –> 01:29:18.927
MAGGIE: Yeah.

01:29:19.147 –> 01:29:20.847
LAWRENCE: Let me just go through my list.

01:29:21.147 –> 01:29:22.927
LAWRENCE: For presentations…

01:29:22.927 –> 01:29:24.847
LAWRENCE: I like Tome.

01:29:24.847 –> 01:29:28.007
LAWRENCE: I like Beautiful.ai.

01:29:28.387 –> 01:29:29.467
LAWRENCE: I’m going to give a couple more.

01:29:30.187 –> 01:29:35.287
LAWRENCE: I think for email, Superhuman is doing interesting stuff with AI.

01:29:35.287 –> 01:29:45.427
LAWRENCE: For summarization of email and things like that, there’s a company called The Gist, which is doing some interesting stuff.

01:29:45.427 –> 01:29:49.847
LAWRENCE: I think for meeting notes, I think most of the core ones are decent.

01:29:49.847 –> 01:29:57.847
LAWRENCE: And I think if you’re interested in AI-powered next steps, tasks, and routing to different systems, there’s really only one, and that’s Ambient.

01:29:58.847 –> 01:29:59.387
MAGGIE: Awesome.

01:29:59.387 –> 01:30:00.947
MAGGIE: Well, we will end there.

01:30:00.947 –> 01:30:04.267
MAGGIE: This has been an amazing panel.

01:30:04.267 –> 01:30:06.387
MAGGIE: Fiona, Lawrence, Jessica, thank you so much.

01:30:06.387 –> 01:30:09.827
MAGGIE: Thank you for allowing me to moderate this discussion.

01:30:09.827 –> 01:30:11.307
MAGGIE: Thank you for everyone who joined.

01:30:11.307 –> 01:30:19.627
MAGGIE: And we will be getting an email summary to you with a lot of different links and highlights from the conversation and the recording to watch as well.

01:30:19.627 –> 01:30:22.167
MAGGIE: So everyone have a lovely day wherever you are in the world.

01:30:22.167 –> 01:30:24.127
MAGGIE: And thank you so much for joining us.

01:30:34.976 –> 01:30:37.136
SPEAKER_3: Please review on Apple Podcasts.

01:30:43.775 –> 01:30:45.475
SPEAKER_3: goburrows.com.
