
How To Navigate Difficult Conversations with AI: Introducing Omi Chandiramani's Good Talk App

Watch the episode here:

Listen to the podcast here:


Episode Summary:
In this episode of Care More Be Better, host Corinna Bellizzi interviews Omi Chandiramani, a technologist and the creator of the Good Talk app, which helps users navigate challenging conversations through role-playing with an AI chatbot. Omi shares his journey in developing the app, the importance of effective communication, and the potential of AI in enhancing conversation skills. They also discuss privacy concerns, AI's limitations, and the future of AI in various fields like education and therapy.

Time Stamps:

<00:00:00> - Omi Chandiramani's Background and Inspiration: Omi shares his background in tech and the inspiration behind creating Good Talk.

<00:03:43> - Climatize and Social Impact: Discussion on Omi's previous role as CTO at Climatize and its impact on climate change action. Listen to that episode here.

<00:05:24> - Developing the Good Talk App: Omi explains the challenges of building scenarios for the app and how real-life user input enhances its effectiveness.

<00:10:00> - User Privacy and Data Security: Omi details the privacy measures taken to ensure user anonymity and data security.

<00:14:55> - Artificial Intelligence and Its Applications: Insights on AI's potential, limitations, and its role in tools like Good Talk.

<00:20:22> - AI Hallucinations and Guardrails: Explanation of AI hallucinations, their implications, and how Good Talk addresses them.

<00:24:08> - Good Talk's Application in Various Fields: Discussion on the app's potential use in sales training, education, and therapy.

<00:28:17> - Voice Communication and Future Enhancements: Omi shares upcoming features like voice interaction and sentiment detection.

<00:30:32> - Supporting Multiple Languages: Plans to expand the app's accessibility by supporting multiple languages.

<00:33:28> - Accessibility Across Platforms: Appreciation for the app being available on both Android and Apple devices from the start.

<00:37:22> - Closing Thoughts and Future Developments: Omi's vision for the app's future and a call to listeners for feedback and ideas on new applications.

Resources and Links:

Connect with Us:

Thank you for listening and being part of this community. Together, we can care more, be better, and create a positive impact in the world.


How To Navigate Difficult Conversations with AI: Introducing Omi Chandiramani's Good Talk App

Good Talk App Logo
The Good Talk app's logo, pictured above

Corinna Bellizzi: Hello, fellow do-gooders and friends. I'm your host, Corinna Bellizzi. Have you ever had a difficult conversation or a negotiation that you soon realized wasn't going quite the way you'd hoped? Today's guest is going to help us figure out how not to run into that exact situation, or how to be better prepared.

I'm thrilled to introduce you today to my friend, Omi Chandiramani. Omi has worked in tech for about 28 years at various big tech companies and startups, including Google and Amazon. He grew from being a software engineer into leading global product and engineering teams. He learned by doing, building and coaching cross-functional teams along the way.

And in this process, he learned that he had a deeper need to help those around him navigate some really challenging conversations. Today, he's here to talk about this problem and the tool that he's helped to create to solve it: Good Talk. Omi, welcome to the show.

Omi Chandiramani: Thank you, Corinna. Really appreciate you having me on here and I'm such a huge fan of the show and you and so many of your guests. So it's truly an honor to be here.

Corinna Bellizzi: You were the CTO for a startup company in the Santa Cruz area that we interviewed. Do you want to talk about that for a moment?

Omi Chandiramani: Sure. Yeah. I was CTO for a company called Climatize and their goal was to create a platform which helped people invest in positive climate change action.

So super thrilling company, and it was amazing to be a part of that journey. I helped them launch the initial platform, which would accept funding from the crowd. That was their model. And they've done really well in the last couple of years. I was just checking this morning, and they're close to having raised four million dollars on the platform that we launched together, and it's all being put into climate change action.

Mostly a lot of solar farms being built, and users of that platform actually earn a return. So it's truly an investment, and some of the returns are eight to ten percent. So that's just a tremendous success story, and it was a huge pleasure for me to be a part of it.

Corinna Bellizzi: We did interview their CEO and founder, Will Wiseman, a few months ago. I had learned at that point, I think, that they were at just over 2 million. So to have doubled in that short time is pretty incredible. I'm proud to see that there are more opportunities for people to invest in their local communities and to be able to see those actions take place, because you can essentially invest in a solar farm in your backyard that helps to create more viable energy for the future.

Care More Be Better | Will Wiseman | Crowdfunding
Crowdfunding: From a high level, the government has leaned in and started to supercharge the deployment of green energy infrastructure. Listen to the episode featuring Will Wiseman and Climatize here.

In a local way, you can actually see that impacting your community, which is, I think, really incredible. And I don't mean your specific backyard, but the backyard area of where you live. I was wondering if, as we get started here, you can go ahead and share with our audience a bit about what inspired you to really develop this app and lean into it full time. Because, heck, you've been instrumental in helping CEOs and large companies create a lot of revenue, and the wealthy become wealthier. So what would you have to say about that? Where are you today?

Omi Chandiramani: Yeah, it's a bit of a story, and it's a few different threads in my being and my personality that have merged into creating this product.

So to start off with, yes, the power of language. I've always been fascinated by language. I remember, as a high school kid studying for the SATs, where essentially you've got to study a dictionary, I started finding words where I thought I knew what they meant, but they actually meant something else.

And this just fascinated me, because often the real meaning of the word was something far more nuanced and even beautiful than what I thought it meant. And it's words like fantastic or ironic or literally. Oh gosh, don't get me started on literally; it makes me a little upset, you know, what's happened to that word over the last few years.

That's a different show. But I was definitely that nerdy kid who would say, you know, fantastic actually means this other thing and you're using it wrong. That was me. And building on this was just a fascination with communication in general. I remember reading Men Are from Mars, Women Are from Venus.

And I was equally amazed by the power of being aware of our differences and context, and using that to create understanding and connection. But also I was just like, how the heck do we ever communicate at all, beyond simple statements like "the cat sat on the mat"? It would just flummox me that communication happened at all.

I wanted to study it even more. And then when I started working, and especially when I started managing and coaching teams, like you mentioned, that was another gear shift in my study of communication and influencing others. And I got to experience how a single good conversation, when done right, could really change everything.

You could change the whole trajectory of a project, or a person's career and their career choices, and, you know, their life. And this was always so thrilling to be a part of. So communication has been a thread in my brain for as long as I can remember.

Not saying I'm an amazing communicator, but I'm certainly a very persistent student, and I love studying it. So that's one aspect of this. The other aspect has been my journey as a person of color in America, which hasn't been entirely smooth, especially dealing with race relations and the understanding of race in America.

And as I've become an adult and had a family of my own, my family was going through this as well. The anxiety of the last microaggression, or the frustration of it, is always fresh. Things just keep happening at work, or at school, or just in the grocery store.

But then, absolutely, I recognize that my experience as a person of Indian origin in America is just a tiny, tiny part of a much bigger social issue that affects black people in America. And despite huge strides in civil rights, black Americans continue to face systemic inequalities across various dimensions of life, often paying with their lives.

And what I was experiencing was just a small tip of that huge iceberg. So from 30 years of living in America, I've felt a growing responsibility to be part of some progress on these issues, not just for me, but for all people of color in America.

Corinna Bellizzi: And you were sharing a particular story about your daughter, where you're experiencing your daughter coming of age, going to high school in America, and things like someone being weird or judgy about the food that she had in her lunch, and how she might handle that microaggression, as one specific example.

I wonder if you could talk about that, because when we first started talking about what you're doing with this app, I hadn't necessarily even thought about it as a tool for teenagers. When kids are going through one of the most difficult periods of their life, they might be struggling to be heard.

They might even run into friction with their parents and feel like, why am I in this situation? I can't be heard; I'm just being ignored. How could this tool help somebody in that situation?

Omi Chandiramani: Yeah, totally. The story about my daughter: when I started building, I decided I needed to do something about these racial divides in America, and I landed on the idea that, well, it's got to be a technology app of some kind.

And I struggled with that. The thing that kept getting into my head was, most people in America don't even know how to talk about race. So how can I create something to help people build their awareness if we can't even talk about this thing? And that's when I landed on this idea of, well, maybe let me attack that problem.

Let me help people build the skill to talk about race. That was one of the early versions of the app, where I would just set up scenarios where people could talk about race. So I started building these scenarios, and the week I was working on one of the early ones, our local BLM sign got vandalized. I don't know if it was the first time or the second time, but it was one of the two, in Santa Cruz.

And so that was a scenario. The scenario I set up was: you're going to talk to your black colleague about the sign getting vandalized. What is her experience? What is she going through? What is her family going through? So that day, my daughter came back from school, and I was like, hey, I've started setting up scenarios for this app.

And she's a very precocious 12-year-old, very eager to learn about what I'm doing, so we would talk about these things a lot. She came back from school, had a snack and stuff, came up to my desk, and was like, hey, what's going on? I was like, okay, I'm setting up scenarios.

And here's one I set up about the BLM sign. Can you think of a scenario? And immediately, in a heartbeat, she came up with the scenario that you're referring to, which was that somebody had made fun of her lunch. My wife is an amazing cook.

Corinna Bellizzi: And for anybody listening: you spoke about microaggressions earlier, and that could absolutely be a microaggression, because food is so often tied to culture, and they're basically teasing that, right?

Omi Chandiramani: Totally. Yeah. I mean, it didn't look like, you know, a peanut butter and jelly sandwich, and nothing wrong with that. But it was something this other kid just thought was weird, and poked at my daughter about it. And my first thought was, is this really about race?

But then my second thought was, oh my gosh, that's an amazing conversation to practice, and I could never have come up with that. Even AI could never have come up with that. Well, maybe in a hundred tries or something, it would have come up with something so real and gritty and meaningful to my daughter.

So that was a light bulb moment for me: I shouldn't be trying to create these scenarios, because users are going to make scenarios that are far richer and far more meaningful than anything I'm going to make. And the second realization was: this one was about race, but what if it hadn't been about race?

Maybe I shouldn't be limiting this tool to conversations only about race. It's just about everything; it's about every conversation. And if a middle schooler wants to talk about how she responds to a bully, so be it. Then, you know, I have a responsibility to help that person.

three beautiful black people seated next to one another with white text painted on their skin stating "I will not be silent so that you can be comfy", "my son is a threat" and "black, beautiful, powerful, strong".
Three people of color sit in solidarity, white text drawn on their skin, speaking their truth in silence. "I will not stay silent so that you can be comfy", "My son is a threat." and "black, beautiful, powerful, strong" are their core messages.

But this is how I got the app to its current state, which is all about users quickly creating scenarios that they want to discuss and practice, then practicing them with this role-playing chatbot and getting coaching along the way on how to improve their effectiveness in that conversation.

So one of the first conversations ever had on an early version of the app was my daughter learning how to talk to a bully at her school.

Corinna Bellizzi: So in the app, people can explore situations that are already kind of baked in, and then I think they can also explore other situations that other people have posted anonymously. Is that correct?

Omi Chandiramani: Yeah, there's nothing that's truly baked in; the scenarios that are in the app are all created by real users. So people can search through them to see if any of those scenarios are relevant to their own experience and practice those: you know, negotiating your salary with your boss, or having an amicable breakup conversation.

There are so many commonalities in our human experience that those scenarios are amongst the most popular in the app today. But if you have something specific, or you can't find what you're looking for, you can just quickly create a scenario: I want to talk to my aging dad, who needs to move to an assisted living situation but is resisting it because of losing his independence, and so on.

So you can just set that one up. And that's a real scenario, by the way, that somebody set up in the app recently. You can set up whatever scenario you want to discuss, practice it immediately, and get coaching immediately.

Corinna Bellizzi: So I have a couple of questions that relate to artificial intelligence, but before I head into that, I'm also really curious if you can share a little bit more about how the security of the individual that's participating is protected, because I would think they want to be sure that this is completely anonymous.

There might be some sensitive discussions that they're putting out into the world, and they might not want it to get out that, let's say, they're using this tool in a particular way. So what have you done to protect the users?

Omi Chandiramani: For sure. Yeah, security and privacy. I care about them very, very much in my own personal life, and I absolutely projected that into the architecture of this app.

It's something I thought of from the ground up. A lot of the time, this concern is about: well, you're going to learn a lot about me as a user in the app, and how are you going to use that? So I wanted to just get rid of that whole issue. The app doesn't have the concept of an account or a user.

So there's nothing to associate conversations even with each other. I basically don't even store that data. I don't collect an email address. I don't know who the user is. So there's no data to leak, in some sense, because there's no association with an actual user.

I take that extremely seriously; privacy concerns have been at the forefront of the design of this application. What's sacred to me is the trust of a user, and that goes into every technical decision I made around how data is captured, transferred, and stored, if at all. So yeah, I totally get that concern. For now, the app is just completely anonymous. There is no concept of a user, so there's really no data to leak.

Corinna Bellizzi: And so you just have the app on your phone, you use it as a tool, and it stores scenarios, but with no tie to you specifically?

Omi Chandiramani: None, none whatsoever. Yeah, you can start using the app and create and practice scenarios without creating an account. In fact, there is no concept of an account in the app right now.

Corinna Bellizzi: That's comforting for people to hear, I would think. Now, artificial intelligence is, of course, one of these arenas that's growing more swiftly than any other.

In fact, if you follow tech stocks, it's like NVIDIA, Google, Amazon; they're all at war to gain a leading edge in this particular space. But there's some uncertainty, I think, about where this can lead. I earlier interviewed Mo Gawdat, who was the former chief business officer at Google X.

And he published a book called Scary Smart and really talked about how artificial intelligence could be either the savior that helps to create the utopia we want to live in, or something worse. So where do you stand when it comes to what AI is as a tool, its potential, and also, let's say, its potential threats?

Omi Chandiramani: Yeah. I've read articles from this guest you're talking about, Mo, and I share some of his thoughts. The main one being that it's never about the technology itself. It's always about the people who control it and use it, and also about the people who should be regulating its use.

Technology, including AI, can't independently act unless we humans enable it to do so. The Terminator hasn't shown up from the future and isn't destroying society to satisfy its own ends.

Corinna Bellizzi: Thanks for making me laugh about it. Yes.

Omi Chandiramani: Yeah. I mean, that's just not what's happened. That's not what we've created. We've created a very, very, very sharp knife.

And it's up to the humans involved to ensure that we don't cut what we don't want to cut, and to make sure that it actually improves the human experience. So in some ways, this isn't a new problem. What's unique about generative AI is that it isn't deterministic, right? Given the same inputs, it can produce different outputs.

And for the first time, we've got a technology that can recombine existing ideas and solutions in creative ways. We've never had something that could do it this well and at this scale, as we've seen in the last few years. But these solutions, these creative things it makes up, can be beneficial or harmful. And that's the problem.
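This non-determinism can be illustrated with a toy next-token sampler. This is a drastic simplification, and the vocabulary and probabilities below are invented purely for illustration; real language models sample from distributions over tens of thousands of tokens.

```python
import random

# Toy next-token distribution. A language model assigns probabilities to
# possible continuations and then *samples* one, rather than always picking
# the most likely. That sampling step is why the same input can produce
# different outputs on different runs.
NEXT_TOKEN_PROBS = {
    "sat": 0.5,    # "the cat sat ..."
    "slept": 0.3,  # "the cat slept ..."
    "flew": 0.2,   # unlikely but possible: a "creative" continuation
}

def sample_next_token(probs: dict[str, float]) -> str:
    """Draw one token at random, weighted by its probability."""
    tokens = list(probs)
    weights = [probs[t] for t in tokens]
    return random.choices(tokens, weights=weights, k=1)[0]

# Same "input" (the same distribution) every time, yet across many runs
# we see several different outputs.
outputs = {sample_next_token(NEXT_TOKEN_PROBS) for _ in range(1000)}
```

Across a thousand draws, `outputs` will almost surely contain more than one token: identical input, varying output, which is exactly the non-deterministic behavior being described.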

And that's the problem. Now as a technologist, the solution is to just make sure we have, you know, just make sure we have robust guardrails in place. That are appropriate for the actions. We apply the technology to so build a really good sheet for that knife and make sure it's only allowed to be the knife's only allowed to be taken out in the right circumstance for the right task.

And that just is in quotes this. I think this is where we're going to struggle for a little bit with AI. I think expecting big tech to ethically self regulate is unrealistic. You know, many tech companies they're investing in ethical AI research and frameworks. However, their primary motivation is their bottom line, which they want to be the

Corinna Bellizzi: Top dog, and the quickest way to get there. To explore each of those things: I was listening to an earlier podcast on another show. I'm forgetting which one it was, because sometimes I'm drifting between episodes of different podcasts, but this one in particular was really going through the concept of hallucinations in the sphere of AI, and it was the first time I'd actually heard that terminology.

I'm sure it's not new to you, but I was hoping you could describe for our audience what this really means in the context of artificial intelligence, and why it's important for something like Good Talk, the tool that you've worked to create.

Omi Chandiramani: Yeah, hallucinations are all about what I was mentioning earlier, this recombining of existing ideas and solutions, or really words and sentences, like what ChatGPT does.

It's combining them in ways that aren't constrained by our human expectations or logic or rationale.

Corinna Bellizzi: More like a child would.

Omi Chandiramani: Yeah, or even just shaking up words in a box and laying them out in some order that sometimes makes some kind of sense, to these models.

I mean, it's even more random than what I think a child would do. But what we call a hallucination is when it creates something creative or interesting that doesn't make sense to us, or doesn't make sense for the question we asked.

What I'm trying to say is that it's this random nature, this non-deterministic nature, that makes it create really interesting solutions. It's just that when those interesting solutions don't line up with what we expect, we tend to call them hallucinations. So in a sense, we want something that hallucinates.

We've just got to make sure that we can identify which hallucinations are harmful and which are productive. And that's how I think about using AI in Good Talk, with this hallucination concept. It's definitely something I had to engineer around, where sometimes, in its role-playing or its coaching, it plays the role in a way that isn't consistent with the persona you've created.

Okay, I'm talking to my aging dad, and he's not going to use Gen Z slang, so don't do that. Or I've asked the coach for some help, and it either gets repetitive or the coaching isn't helpful to the context. So I've had to build guardrails in: once the AI spits out something, I have guardrails on what's allowed and what's not allowed.

For example, I have detections for if somebody talks about doing self-harm. I track that, and I have a guardrail there where, if a conversation goes that way, I halt the conversation and give the user some resources to reach out to a real person and get some real help.

That is not a conversation they should be having with an AI chatbot; it's just not going to be helpful in any meaningful way. Or if they ask for medical advice, I pause the conversation and say, hey, this is not a useful tool for getting medical advice. You can talk about medical things, but if you're actually seeking advice, you should go to a doctor. So it's this mix of: I want the creativity, but I've got to put guardrails around when the creativity is a little bit too creative.
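The guardrail pattern described here can be sketched in a few lines. This is a hypothetical illustration, not Good Talk's actual implementation: the function name, keyword lists, and messages are all invented for the example, and a real system would detect these topics with a trained classifier rather than simple keyword matching.

```python
# Hypothetical post-generation guardrail: inspect each conversational turn
# and, when a sensitive topic appears, halt the role-play and return
# resources instead of the bot's reply. Keyword matching stands in for a
# real classifier here.
SELF_HARM_TERMS = {"hurt myself", "self-harm", "end my life"}
MEDICAL_TERMS = {"diagnose", "prescription", "dosage"}

SELF_HARM_MESSAGE = (
    "This isn't a conversation to have with a chatbot. "
    "Please reach out to a real person, such as a crisis line."
)
MEDICAL_MESSAGE = "This tool can't give medical advice. Please see a doctor."

def apply_guardrails(user_text: str, bot_reply: str) -> tuple[bool, str]:
    """Return (halt, message): if halt is True, the conversation pauses
    and `message` replaces the bot's reply."""
    combined = f"{user_text} {bot_reply}".lower()
    if any(term in combined for term in SELF_HARM_TERMS):
        return True, SELF_HARM_MESSAGE
    if any(term in combined for term in MEDICAL_TERMS):
        return True, MEDICAL_MESSAGE
    return False, bot_reply  # nothing flagged: let the creativity through
```

The design point is that the check runs after generation, so the underlying model stays free to be creative while the application decides what is allowed out.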

Corinna Bellizzi: I think that's a great explanation, and I also think people can probably visualize what we're talking about when we say the hallucinations that AI will make.

When you think about some of the AI-generated art that we've seen, or mock photographs, suddenly it looks like somebody's holding somebody else in an impossible way, and there's an extra finger or an arm where it doesn't belong. That's all part of the hallucination too. But it helps for us to push the boundaries, think about the platform and its growth and its abilities, and then also decide where we have to put the guardrails, so that we can get more effective use of the technology as time goes on.

Omi Chandiramani: For sure.

Corinna Bellizzi: Frankly, I find it fascinating.

Omi Chandiramani: It is fascinating. And one thing I tell people about is these stories of how AI doesn't know how to draw hands.

I hope it shows people that this technology has its limits, and that it's limited in some very silly ways. Like, it doesn't know that your hand should have five fingers on it. This idea that it's something that should be feared, that it's going to take over, those Terminator kinds of ideas: a lot of that is still science fiction.

It's not that it's not powerful, but it can be stupid in certain very simple ways. And to me, that's a sign that this is not something that should be feared. It's just something we need to learn how to use well, and regulate.

Corinna Bellizzi: And I think we all experience AI day to day in a way that we can relate to and understand. For instance, if you happen to be scrolling through a feed on your favorite social media platform and you pause on something, then the AI in the background is learning that that's something you might want to see again.

So then it starts showing you that thing. You might've paused there by accident. You might've just been ordering a coffee and had to stop for a second. The platform itself doesn't know that that's what you've done, at least not yet, right? But it'll suddenly show you the thing it thinks you want to see, until it learns that that wasn't actually it, and then kind of goes forward.

So my thinking, when it comes to what your platform is here to do, is that it will get innately better with time, and that you'll see even more effective coaching as it learns what users are really looking for. Do you think that's a fair estimation?

Omi Chandiramani: Yeah, for sure. I think there are a few layers to that. The underlying AI models are just going to get better at analyzing conversations and providing coaching. And yes, as the platform experiences more conversations, and what works and what doesn't, it is absolutely going to get better at coaching.

And again, I just want to point out that this learning I'm putting the model through is all anonymous. It's not, hey, I'm going to coach you specially because I know you make the following kinds of mistakes in your conversation skills. That's not what I'm doing. I'm not going there yet.

Corinna Bellizzi: Good to know. Now, when I hear things like this is going to get better and better at coaching, I also think about what it would feel like to listen to this podcast episode as somebody who has a business as a coach, for example. Do you see something like this, this technology, as a threat to people who have developed careers in this space? Or how do you see it fitting in?

Omi Chandiramani: Yeah, I don't think this is a tool that is an independent conversation coach or an educator of anything, and I broadly don't think that's a successful modality for AI. One of the modalities that has worked in tech at large is when AI is the assistant to the human.

You've got a junior assistant, somebody to help you: a tool that writes the email for you, or writes the marketing copy, or writes the code. I've personally used that in building this platform. But it's my junior assistant. I've got to eventually validate what it wrote, make sure that it makes sense, and then put my human stamp of approval on it before it goes forward.

So I think that's a modality that has really proven itself to be very effective in tech so far, and I think of Good Talk the same way. I think a conversation coach or a therapist or a counselor of any kind will not be replaced, but could use this as a tool. A therapist could assign a scenario to their patient to practice.

Hey, try practicing this conversation with your spouse or your child, and use it as a tool to hone your communication skills. But it's not something that can replace the nuance and the wisdom and the experience that a human coach would bring. So I see this tool purely as an additive to other, more human ways of learning communication skills.

Corinna Bellizzi: I personally think it will help remove some of the discomfort that somebody on the shyer side might have. I think specifically about all of the sales training that I've had over the years, where you're trying to imagine a scenario, talk through the product you're working to promote, work through the conversation from a need-satisfaction selling perspective, find out what their motivations are, and then try to make the case. When you're doing this as a group in an office setting, it can be really difficult for the more shy individual, because they're having to practice while also feeling like they've got a magnifying glass on them.

So I could see this even being something that gets used by salespeople who are trying to refine their pitches, and I wonder if that could be another avenue for you to explore. I'm just thinking this through.

Omi Chandiramani: Yeah, I think there are a lot of places it can be applied. Maybe a bit too many: anywhere there's a conversation.

Some of those are going to be challenging, and then this is a tool that can be applied. So I'm certainly looking for verticals where the tool can have maximum impact, maximum social impact, and where it's in an area that means something to me. I'm going for social change with this whole project.

So sales, yes, that's an area rich with applications for this kind of tool, but I'm definitely leaning towards education or counseling and therapy. And I'm really just keeping my ear to the ground and seeing how users are actually using this tool, to explore where it can have the most impact.

Corinna Bellizzi: Yeah. Now, I have played around with the app a bit, and earlier today I put up a quick little screenshot of your platform, Good Talk. It has three simple steps: one, pick a conversation scenario or create one; two, chat with the bot; and three, get instant coaching. Repeat till you're ready. When I played with it personally, I found that I had to go back, rethink, and create a new prompt to get something that was closer to what I thought I was looking for.

Just workshopping a couple of ideas, I could see: okay, maybe I need to think about this differently, because I'm trying to get the ChatGPT back end to understand what the actual challenge is, and come up with new scenarios. So it takes a little bit of playing with, but I also found it very easy to use and approachable.

Right now you're not speaking, you're typing. Do you have plans to add voice communication in the future, so you're doing it audibly?

Omi Chandiramani: Yes, absolutely. And that's kind of the next iteration of this app. What I'm trying to emulate is this thing we're doing right now, right? Real-life communication with pauses and frowns and laughter. That's the goal: I want to try and emulate as much of that as possible.

And that's what I'm actively working on right now: detecting sentiment in your voice, you know, and when the bot speaks back, and it will be speaking back soon, it has emotion in there too, and accommodates all these little nuances of a real conversation, so that people can really practice the real-life skill that I'm hoping to build in them.

So that's absolutely coming. And the technology is not far from supporting all of this. There was a huge announcement from OpenAI earlier this week, which is absolutely going to support that kind of a modality. So, that's absolutely coming.

Corinna Bellizzi: So at the present time it's offered in English, and people can find the app in the Apple App Store as well as in the Google Play Store. Are you planning to release it in other languages? How does, let's say, somebody who is a native Spanish speaker go ahead and utilize the app in Spanish?

Omi Chandiramani: Yeah, for sure. That's definitely on the roadmap as well. Again, I'm going for the broadest social impact possible, so I'm exploring where I will find that next. Is it going to be focusing on certain verticals, or focusing on certain languages, or something else? This is part of the research.

I'm in the middle of trying to kind of talk to people and see how it's being used and see what feedback I'm getting. But absolutely, support for multiple languages is coming soon.

Corinna Bellizzi: Yeah, that's fantastic. Now I will say, Omi, so often what I've run into, I'm an Android user, I've spoken about this before on the show.

I don't like Apple for a number of reasons, and I don't need to go through them now, but I'm so appreciative when somebody who's working to create an app comes out with an Android version early. I only have Apple devices because I podcast, and some apps don't work on Androids even today. So I have an old iPad and I have an old iPhone, but they sit on the shelf most of the time. And I just have to applaud you for doing that so early in your phase of development, for making it accessible to all people. That's something that I personally think more companies should do from the beginning, and it just makes me proud to call you a friend.

Omi Chandiramani: Thank you. Even if, even if I'm an iPhone user, Corinna.

Corinna Bellizzi: Yeah, even if you're an iPhone user.

Omi Chandiramani: Thank you. No, I hear you. And this was an early thing. I said, I'm going to launch on both platforms at the same time. I cannot leave one platform behind. It just doesn't make any sense to me as a technologist to only serve half the population or one part of the population.

So thank you. Thank you for appreciating that.

Corinna Bellizzi: As somebody who's very concerned with our ecology and wants to have products that last, my number one complaint about Apple is how quickly they bake in planned obsolescence. And I'll give you an example: the last Apple product that I purchased that I didn't buy used, which is my MacBook Pro.

I have a MacBook Pro that I bought in 2011, and it would still be completely functional and usable, but Apple stopped supporting the upgrades to their OS. And so now my husband, who is a technologist himself, has set it up as a Unix machine. So I'm still able to use it, right? But otherwise it would basically be scrapped and recycled, which is something that I take issue with.

And so while I might like their phones and their platforms and their technologies, I don't like how quickly they bake in the planned obsolescence. And in my experience with their iPhones, I've broken too many. And then there was this one advertising campaign that was just the last nail in the coffin for me, which they undertook, I think it was about 2011 or 2012. The advertising campaign went like this:

If you don't have an iPhone, you don't have an iPhone. And it was within days of that advertising campaign really upsetting me that I broke my phone again. And I was like, that's right, I don't have an iPhone. And I got myself a Samsung, and I've got to tell you, this is my newer Samsung, an S20.

I still have my S8. It's fully operational and I use it all the time. It ends up being my backup and a tool that I use for things like podcasting and recording in other settings. So I don't have stock in Samsung, but maybe someday I should buy some.

Omi Chandiramani: Yeah. No, I hear you totally about this planned obsolescence, and the waste that it creates, and the expense that it means for users.

And I hear you, I'm not a fan of any of that. My only excuse is that I'm a technologist, and I've focused a good chunk of the last 10 or 15 years of my career on building mobile apps. And I feel like every four or five years I switch. Actually, that's my plan, to go back and forth, because I feel like I have to in order to keep my skills fresh and relevant. I've got to know how both platforms are evolving and working at that point in time. So I keep bouncing between the two as part of my own career development.

Corinna Bellizzi: Yeah. Well, I think that's smart. Again, I bought my iPhone and my iPad used, because I just don't like that whole newness thing, constantly having to get the newest, latest, whatever.

But they work great, you know?

Omi Chandiramani: Yeah. And now especially, when, you know, getting the new thing means you've got to spend a thousand dollars. I mean, that sounds good to me.

Corinna Bellizzi: That's also why I bought them used, because I was able to pick each of them up for less than half of the price, right? And so you can buy refurbished, and you don't necessarily need to have the brand new, and that can help you manage not only your pocketbook, but also get past the fact that it might stop working as well as you want it to before you're ready to move on to your next device.

Omi Chandiramani: Yeah, I love that. Totally smart. And it's the eco-friendly thing to do too. You're kind of reusing something that somebody else threw away.

Corinna Bellizzi: Yeah, or just wanted to upgrade, you know? Like people constantly want to upgrade. As it stands, I just want to say thank you for the app.

I think it's a really novel use of the technology that is available to us today. I can see it being very functional for just about anybody who'd be listening to this, and you can go and just download the app and try it out. I will provide links in the show notes. I also saw that you have a YouTube tutorial on the use of the app up on your YouTube channel.

I'll include that link with the show notes as well. Is there anything else that you wanted to share with our audience as closing thoughts? Or perhaps there's a question that I haven't asked yet that you wish I had, and you could ask and answer that.

Omi Chandiramani: Yeah, I think we covered a lot of what I wanted to bring up. Maybe I can just double down on the idea that this is definitely a proof of concept for me. Text chatting is not the way, you know, it's not the flow of a real conversation. And I'm certainly looking for ways in which it can be expanded into new verticals. Education or counseling or therapy are the ones that are meaningful to me.

But just as a call-out: if you're listening and you see a particular use for this tool in a vertical that we haven't discussed, please reach out. I'm trying to learn as much as I can from the world and from people using the app on how this can be useful. I do believe deeply that a good conversation can change everything, and I'd love to get it everywhere it can have that effect on society and people and the communities they represent.

So if you have a thought like, "Hey, we should use this tool somewhere," reach out to me.

Corinna Bellizzi: And it's presently free, right? There's no charge to download the app.

Omi Chandiramani: That's right. Yeah. And I intend it to be free, this public-facing app, the consumer version of it, free forever. It's in these verticals that I'm going to try and build the income and the sustainable business part of things, where maybe it'll be charged for in an education environment or a counseling environment.

But this consumer version of the app that you use, the one available in the app store, as long as I have anything to do with it, will be free forever.

Corinna Bellizzi: Yeah, so I see that as a model we see in almost every software-as-a-service business, it would seem, where you have a functional app that can be used for basic applications, and then something more specialized or with additional features could be a subscription that costs more.

And so I think that makes sense, and it makes it very accessible. If you have teenagers who are having difficult times at school, maybe they need this app too, you know, so that they can work through some of those difficult conversations with or without you. And I love that use of it.

And frankly, I hadn't thought of it until I discovered that story about your daughter. I think that's so great, and thank you so much for sharing.

Omi Chandiramani: Thank you, Corinna. Thanks for the support and the encouragement all along.

Corinna Bellizzi: To find out more about Omi and Good Talk, visit the links that we provide with show notes. You'll find direct links to download the app on your Android or your Apple device, and you'll find a link to our expanded blog, including complete transcripts from this episode. There will also be the video version of this podcast.

I will also include additional resources and links to the prior episodes that we discussed in today's show, including my interview with Mo Gawdat, as well as with the founder of Climatize. We're also launching something new for Care More Be Better. This is a cause-before-commerce site called circleb.co.

This will launch this summer and will feature all sorts of brands that help you live a little greener, as well as products that are more responsible. It will also help to provide you with tools that enable you to reuse what you have, reduce your waste, and eliminate or limit the plastic use in your household.

There will be DIY tools as well. And I plan to invite people like Miyoko Schinner, who has been a prior guest on Care More Be Better, to share a recipe or two that our audience can make at home, creating your own vegan cheeses as just one example. If there are particular products that you think we should engage with, feel free to reach out via the form on https://circleb.co and let me know. Perhaps you represent a brand that you think I should include on this website. Thank you, listeners, now and always, for being a part of this pod and this community, because together we really can do so much more. We can care more. We can be better. We can have more productive conversations and live a little more peacefully as we build the future that we want.

Resources and Links:

Connect with Us:

Omi Chandiramani, CEO & Founder of Good Talk, the AI-enabled app that helps you navigate difficult conversations.

About our Guest:
Omi Chandiramani, CEO and Founder of Good Talk, has worked in tech for about 28 years at various big tech companies and startups, including Google and Amazon. He grew from being a software engineer into leading global product and engineering teams. He learned by doing, building and coaching cross-functional teams along the way. He built a tool for his teams of past, present, and future: to help people get better at challenging conversations.