In this compelling episode of Noble Warrior, CK Lin interviews Andy Russell, an internet advertising pioneer turned humanitarian. Andy shares his journey from feeling guilt over his contributions to societal divides through digital technologies to finding purpose in global healing. Discover how he addresses mental health and human suffering through personalized sound medicine.
Are you a successful entrepreneur or professional seeking deeper fulfillment and purpose in life? Join us in this riveting episode of Noble Warrior, where host CK Lin sits down with Andy Russell, a pioneering tech entrepreneur who has redefined his life’s mission from digital innovation to mental health advocacy and sound therapy.
Andy Russell, an internet advertising mogul, has incubated and invested in over 50 groundbreaking technology companies. In this deeply personal and insightful interview, Andy reveals his transformative journey from leveraging data and AI to influence global politics, to realizing the darker impacts of his work, and ultimately, dedicating his life to reducing human suffering through democratized access to sound medicine.
For more information and to listen to the episode, visit Noble Warrior Episode 182.
[00:00:00] ck: Welcome to Noble Warrior. My name is CK Lin. This is where I interview masters about their journey to master the mind, body, and spirit, to have greater performance, greater joy, and greater purpose in life and business.

My next guest is a remarkable innovator and a humanitarian dedicated to reshaping the world for the better. He has incubated and invested in more than 50 groundbreaking technology companies and has been a driving force behind collaborative platforms for the United Nations Sustainable Development Goals.
He's on a mission to address the root cause of human suffering, mental health, by democratizing access to personalized sound medicine. Please welcome Andy Russell.
[00:00:55] Andy: So nice to be here. Thank you.
[00:00:57] ck: Beautiful. So Andy, [00:01:00] you and I, we met through another Noble Warrior podcast guest, Jan. And when we first got on the phone to have a conversation, you asked me a really interesting question right away. You said, where do you find purpose? And it was a beautiful question, because there was very little, you know, how's the weather, where are you from,

what do you do. It was just, where do you find purpose? So I love that. Can you say a little bit about where you find purpose?
[00:01:34] Andy: Sure. Um, so, you know, I'm 52. In the world that we live in, you might have accomplished a lot of stuff, and, um, you know, where do you go from there? Um, and to me, I have an 18 year old daughter, I have a 16 year old son.
And I also know that there are a lot of 16 year olds and 17 year olds and 14 year olds out there. There are lots of people [00:02:00] across the planet, um, who are suffering, and suffering greatly, right in this very moment. So, to me, my purpose, whether I chose it or it chose me, is to help, uh, reduce suffering in the world.

Uh, you know, we, we live in, in a time that's quite remarkable. You know, humanity has evolved, through technology and our own, uh, adaptiveness, uh, to pretty much have everything we could possibly need: from energy, to food, to shelter, to connectivity for education. Um, and now we have AI, which, if it goes the right direction, would mean we can have robots doing a lot of the work we don't want to do. It's like, it's like, ta-da, we made it. Um, yet, you know, we're heading in a direction, as we can see in the world today, of so much violence, so much hate, so much fear, um, which causes physical harm, but also causes [00:03:00] so much mental harm. And, you know, if we came together collaboratively, uh, around compassion and love and forgiveness,

uh, we could just reduce so much suffering. So, you know, in this life, uh, that is my purpose. For those as close to me as my family, to those on the other side of the world, uh, who I will never meet. But just to raise all consciousness and compassion, and realize that we made it, and celebrate as humanity. So I'm hoping to do my little part, or my big part, in helping us get there.

Uh, and that's, that's, I mean, that's it. Straight on, that's my purpose, with very little other distraction.
[00:03:44] ck: I love that. Now, was that a realization that you had after having had a few successes under your belt? Or did you kind of know that all along, and you were just, you know, looking for ways to tactically, [00:04:00] strategically wean yourself off the, you know, basic survival things,

so that you could have the bandwidth to focus on more meaningful work?
[00:04:08] Andy: Yeah, I wish I could say this was my calling since birth. Not even close. I was born and raised in Manhattan. Uh, I, you know, I went to good schools, and, you know, I was conditioned and programmed, uh, just the way, you know, we're all conditioned and programmed: to make money, have kids, provide, uh, and then, if we can, enjoy some of life. Uh, so it wasn't really until around 2014, after I saw some of the digital technologies, uh, the data and algorithmic, um, functionality that I was part of incubating along the way and growing, that I really saw it turn to the dark side.

And realized [00:05:00] that a lot of what I had helped build, um, was leading to destruction and harm, uh, throughout society. And it was an extremely rude awakening, and it put me, quite frankly, into deep trauma and, and pain and suffering, just around guilt, pure guilt. And it's when I realized that, you know, all these great innovations, in the wrong hands, or following, um, what we've been trained in capitalism to do, to make money, um, and I'm not saying capitalism is bad, but it can, can be very harmful in, in a massive way because of connectivity. It was at that point, right around 2014, that I, I just woke up to a really horrible reality. Um, and it didn't just happen overnight, but the rest of my life would be to use the same technologies, and, as best as I [00:06:00] possibly can, figure out ways to, to heal, uh, others on a global basis. Um, so flipping those algorithms on their head: as opposed to leading to mental health issues, uh, to understanding mental health issues and helping heal mental health issues on a global basis, because we're all connected.
[00:06:17] ck: I love that.
Thank you. So, you know, not to dwell in the past, but I think it adds color and, and, and nuance to what you were talking about. Because some may say, right now, in this time, this is another rude awakening moment. Like, hey, this is, you know, thousands of years of conflict, expanded, expounded, extrapolated over time, and yet it seems like we have made no progress so far. Or, based on mass media, we know we have, but it's just a small group of people making a ruckus. And, you know, anyways, not to get into the politics of it all. Um, [00:07:00] can you paint more of a picture of, like, what were the indicators of, hey, this thing that I helped birth into the world is causing harm?

So that way people aren't like, oh, I didn't know. That's how I can interpret it, you know what I mean? So if you don't mind, bring more context, you're
[00:07:18] Andy: bringing me deep into my most traumatic moment. We'll go there
[00:07:22] ck: for, for, for teaching purposes.
[00:07:25] Andy: Only for, for, uh, teaching and understanding purposes. That's
[00:07:28] ck: right, that's right, that's right.
[00:07:31] Andy: Sure. So, um, I've built lots of, um, these companies, and I've sat on lots of boards, including the Data & Marketing Association. And I sat on a bunch of boards of big companies, such as American Media, which owns all the tabloids, including the National Enquirer. And one might think of those people who read them as, yes, educated, but also maybe fairly persuadable.

Um, in [00:08:00] 2014, with the collective knowledge of where I was and the companies I'd built, I was able to figure out how to capture what I call a digital scream, um, through, uh, where people pay attention online, um, which really goes into their subconscious. How do you agitate them, down to their deepest anxieties and pain points?

Um, and when I was able to capture that, uh, right around 2014, off of all the email addresses from all these different publications, such as some of the ones I just alluded to, uh, I was able to, uh, identify which issues mattered, whether it's climate change, or the border, or taxes, education, um, war, whatever it is. I could capture those through emails that people might open or not open, the [00:09:00] subject lines, and then once somebody would open one, I could track if they clicked on the content. And these were email newsletters. So if they always clicked on issues with the border, one way or another, I could basically psychologically profile them off of just their engagement. And then with those email addresses, I was able to upload them directly, the exact people, to Facebook, Instagram, Google Display, and YouTube, and then surround-sound back to them in video.

And this was the first time video ever went, uh, onto Facebook mobile, in 2014. It was the Ice Bucket Challenge. So if you could capture somebody's digital scream, then you were able to echo back that scream to them, in something I coined, you know, an echo chamber of digital communications, to actually raise their anxiety levels, basically surrounding them with that reality which they [00:10:00] feared the most, and then being able to give them an outlet for it.

Which would be either buy this, do that. And it just so happens that 2014 was the first time any of this was possible. Uh, and it was two years before the next election, 2016. Because up until then, two billion dollars of ad spend, uh, went to television for political marketing. This is what I just call weapons-grade communication.

Uh, which put television, like, to shame in terms of personalized, um, anxiety building, um, reinforcing belief systems that the reality they fear the most exists.
[00:10:47] Andy: And then give them an ability to vote, to protest. And it turned, obviously, quite sour as time went on. Um, so I [00:11:00] went after the 2 billion market, and the stuff I built with partners in the media industry slipped very quickly out of our hands, my hands in particular, and got to, um, not just our country's politics, but then to Steve Bannon and everywhere he went around the world, to be able to rip apart democracy and turn things in upon themselves.

Um, so fast forward to now: we're living in a completely divided world, a divided country, in what we believe to be truth.
[00:11:38] ck: Right. So, so, so pause on that for a second, Andy. I mean, I appreciate the, the analysis and the looking back and, you know, and, and your journey to overcome and neutralize this, this part of your life.
Yes. But, but in looking back, I mean, [00:12:00] again, I'm, I'm looking at it from an objective perspective, right? So, I mean, yeah, it is a capitalistic society, and of course, now these are the tools available. I can imagine people would, quote unquote, weaponize it to the nth degree, right? So, I mean, was it not anticipated?

Like, hey, this could happen. Like, what did you think people were going to do with this tool, this powerful tool? You know what I mean?
[00:12:30] Andy: I understand. Um, thank you for the additional guilt, but it's cool.

[00:12:36] ck: Again, I'm not, I'm not trying to make you feel bad.

[00:12:38] Andy: It's a very, very fair question. Um, so in my idealistic brain, um, it would be a way of understanding people's fears.
And then the politicians could speak to and address whatever the fears are of who should be their constituents. [00:13:00] And then the constituents could vote based on their fears and anxieties. Okay. Um, clearly I was naive and wasn't thinking past, um, how this could be a better way for politicians to present themselves, if they actually had a way of understanding each of the citizens in this country.
[00:13:22] ck: Um,
[00:13:24] Andy: I did not think, and I should have thought, uh, in advance, about how powerful it is. Because it is that powerful, as we've now seen unfold: the polarization of, um, if your belief is on one side of an issue versus the other, one side of the aisle versus the other, you just live in different realities. And when you have those belief systems, as you know, throughout, you know, human existence,

if your belief system becomes strong enough, then you're, throughout all history, willing to die for a cause or kill for a cause.
[00:13:58] Andy: I just didn't see the force of it. [00:14:00] Not at all. Not even close. And, you know, I should have. And maybe I was blinded by the 2 billion opportunity, but I did not ask: where does this go wrong?
[00:14:14] ck: So at what point, again, right, this is all for teaching purposes, I'm not trying to make you feel bad at all, okay? So this is all just more for illustration. But at what point did you realize, like, okay, I've got to go the other direction, because this train is running in a direction that's not the one

I want it to go, and let me switch gears to do something different? Perhaps maybe use it for the good that I want to bring to the world, rather than for the not-so-good in the world. You know what I mean?
[00:14:44] Andy: I mean, I understand. So, well, it turned out, and there are certain names who I'm not going to use right now. But, you know, I'll, I'll allude to certain names, because you know what transpired afterwards and what, uh, [00:15:00] danger I put myself in.

Um, so I'm going to be a little careful.
[00:15:05] ck: Um,
[00:15:05] Andy: It was when the, uh, the head of this big media conglomerate, uh, American Media, uh, I had lunch with him, and, uh, he asked me if I was watching the, uh, debates, the Republican debates, uh, for, uh, 2016. And I said, actually, I'm not really paying that much attention to it.

He goes, well, you see that Trump is winning, right? I go, yeah. He goes, well, um, I'm writing the debate material that he's collecting. You realize that these tabloids, um, have forever been able to dig up all this dirt on anybody you want. And, uh, then he told me, reminded me, or let me know that, um, Donald Trump was one of his [00:16:00] best friends and closest allies,

um, and that, uh, the debate material was, was dirt that was being dug up by things like the National Enquirer,
[00:16:09] ck: okay?
[00:16:11] Andy: Uh, at which point I realized, oh shit. Um, this is the guy who I'm working with, building this system of data, information, technological profiling, and how to influence and persuade people to a scary degree. This is who I'm building it with, in a joint venture.

Um, and he's clearly going to do anything in his power to get Trump elected. So what I was hoping to launch as a business for Democrats, Republicans, state, Congress, local, anything, uh, just weaponized one side of the equation. And to me, that's not democracy. Uh, and I just happen to be a Democrat. The Democrats, right, [00:17:00] were still, um, using television as a marketing medium for politics, thinking, oh yeah, we'll spend a little money here and there on Facebook. And the guy executing on this for Trump's campaign was a guy named Brad Parscale, who was absolutely phenomenal at executing on this whole plan, uh, and was in charge of getting him elected.

And remember, back in the day, everybody thought, oh, Hillary's up by 10 points, there's not a chance in hell. I'm like, no, you're so wrong. You're so wrong. You don't understand. So I desperately tried.
[00:17:35] ck: Wait, wait, wait, hold on. Back up one sentence. Yes, I think you said, you're so wrong, you don't understand. You have some insight that I think most people don't have.

So: you're so wrong, you don't understand. Say a little bit more about that part.
[00:17:49] Andy: You don't understand how strong the weapon is, the communications weapon that is being deployed by the Trump organization.
[00:17:58] ck: Um, [00:18:00]
[00:18:01] Andy: You, you, you don't understand, right? Like, you don't get it, and you have to adopt this new methodology.

And when you don't do that, you get stuck with, um, all these huge consulting firms that the donors give money to, who then go spend the money, for a kickback or whatever, to be paid to put ad dollars onto television. Um, and, uh, so I was screaming from the rooftops, and I think
[00:18:27] ck: Pause, pause. Once again, again, this is not about politicking. This is just data, right? I'm trying to understand the trends of this. So again, I'm kind of nerding out on data for just a bit. So, so what most people at the time saw was the polling data, and that's based on TV or, I think, like, Nielsen or whatever sampling thing that they do.

And what people didn't understand is the guerrilla stuff, the social media, the more targeted things that are not there where they can see. [00:19:00] And that's the mismatch of, like, hey, Hillary's winning, when in fact all this stuff is happening on the ground that they don't know about. Is that what I'm hearing? Is that where you're pointing to?
[00:19:11] Andy: Um, it's more than that.

[00:19:13] ck: It's more than that. Okay, I oversimplified. Please explain.
[00:19:16] Andy: More than that. Um, say one side spends 1,000,000, and it's only 1 percent effective on digital, even in digital. But the other side is spending the same 1,000,000, uh, with this method, call it a methodology, and it's, you know, a hundred times more powerful, all right, a hundred times more powerful at, uh, persuading people, getting into their subconscious and programming them, programming people, turning people into avatars.
Um, so,
[00:19:55] ck: um, so what I'm trying to get to, Andy, again, not to get into too much [00:20:00] detail with the methodologies, but maybe there's some high level. So, like, what is the methodology of programming people? That's, that's what I'm trying to understand.
[00:20:10] Andy: You really want me to share this right now?
[00:20:12] ck: Yeah, I mean, if it's, it's confidential then don't do it.
It's all good.
[00:20:17] Andy: Listen, it's not confidential. Um, and, and the fear going forward, clearly, and I'll share it with you, is when AI gets behind it. Like, game over.
[00:20:26] ck: Okay.
[00:20:26] Andy: Not just with one side anymore. Okay. Um, what is it?

[00:20:33] ck: And by the way, the reason why I'm asking this question, and I don't mean to keep interrupting you, uh, is, is this: I think, I mean, we frankly don't understand the machinery behind the scenes. And I think, I'm pretty educated, and I was in digital media, but I don't understand, you know, all the things that you're talking about, right?

And so what I do know is maybe the, the, the business [00:21:00] model of a media company: they get more ratings if they can, you know, evoke fear. And so I don't really take media companies that seriously, because I understand that's the incentive, inciting fear, right? So that, so that kind of trickles down. So we're talking in more granular detail here. Anyways, with that said, did that give you a little bit more time to think about it?
[00:21:26] Andy: All right. I mean, I'm not sure how, how, uh, granular I should get, so stop me when I get too detailed. Imagine there's a magazine, um, that has a couple million people who read it or go to it online.

Imagine you have all their email addresses, because you got them to sign up for an email newsletter. Now you start sending them lifestyle email newsletters. And in the subject [00:22:00] lines, you're tagging things that will give you, um, a very, um, large signal of what matters to these people.
So say you have, you know, you, you send out an email a day and of those, let's say it's a million emails,
[00:22:21] ck: right? Um,
[00:22:21] Andy: there are, say, 50,000 people who always open up an email about fear of immigrants crossing the border, right? And really bad fear, like rape of your children, taking your jobs. Like, real fear. And they always open up on that subject line, and then they also always, they always click through on that content.

[00:22:50] ck: Okay, quick question. So just for the people who are listening: opening emails, clicking through, just indications of interest. So every time they do that, it just adds more to that [00:23:00] score. Yes. Keep going.
[00:23:02] Andy: So if you're paying attention to what they're paying attention to, listening, right. Um, and we all know, I mean, I have a very crowded inbox in my emails,
[00:23:13] ck: right.
[00:23:14] Andy: But if I always open up on something that is around, you know, people are crossing the border, taking your jobs, going after your children, robbing you, all this stuff, that is a very large signal,
[00:23:26] ck: right,
[00:23:26] Andy: by paying attention to where they pay attention, right? And it's not just paying attention with your eyes, it's then triggering an emotion in your body that will actually move your hand to hit the click, which is a big deal.
Once you open the email and you read all this content, there's a link to a further story, then you move your hand again,
[00:23:46] ck: right,
[00:23:47] Andy: and click that. Now, if these people, 50, 000 of them do this, um, you know, 10 times over the course of two months around the same type of content, you know that these [00:24:00] people really are afraid of people crossing the border and coming after their family or taking their jobs.
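The signal-scoring step Andy is describing, weighting clicks above opens and accumulating a per-topic score per reader, can be sketched in a few lines. This is a hypothetical illustration: the event format, weights, and threshold are invented for the example, not taken from any real system.

```python
from collections import defaultdict

# Hypothetical engagement log: (reader_email, subject-line topic, action).
events = [
    ("a@example.com", "border", "open"),
    ("a@example.com", "border", "click"),
    ("b@example.com", "taxes",  "open"),
    ("a@example.com", "border", "open"),
]

# A click is a stronger signal than an open: as described above, it takes
# an emotional response strong enough to move the hand.
WEIGHTS = {"open": 1, "click": 3}

def score_engagement(events):
    """Accumulate an attention score per (reader, topic) pair."""
    scores = defaultdict(int)
    for email, topic, action in events:
        scores[(email, topic)] += WEIGHTS[action]
    return scores

def build_cohort(events, topic, threshold=4):
    """Readers whose accumulated score on one topic crosses the threshold."""
    scores = score_engagement(events)
    return sorted(email for (email, t), s in scores.items()
                  if t == topic and s >= threshold)

print(build_cohort(events, "border"))  # ['a@example.com']
```

Repeated over months of daily sends, a loop of this shape is how a list of a million readers gets distilled into the 50,000-person fear cohort the conversation describes.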
[00:24:05] ck: That's
[00:24:07] Andy: number one. Yeah. Step number two: you take that 50,000 and you upload it into what's called Facebook or Google Custom Audiences. And very few people know this. You sign into Facebook or Instagram or Google or YouTube or any of them with one email address. But what most people don't know is that all the biggest data companies in the world sold to Facebook and to Google all of your other email addresses.
So if you have seven email addresses, you may have logged on with one, but they've got seven. So if I've got this cohort of 50,000, I put it into a custom audience. I can find most of those people on Facebook, Instagram, right? Google, YouTube. And then these, the media buying [00:25:00] platforms, allow you to do custom audience lookalike. Which means, and what most people also don't know is, all the biggest data companies in the world, such as Acxiom, Experian, Epsilon, Oracle, sold all of your credit card data and your flight data and your whatever-you-buy data behind the scenes.

All of it. So everything about you: your entire path of life, every decision you made to purchase anything under some socioeconomic condition, you had a kid, you didn't have a kid, whatever, right? It's all behind those scenes. So when you upload that 50,000 into a custom audience and you do custom audience lookalike, um, the algorithms behind the scenes will figure out for you what these 50,000 have in common,

across tens of thousands of data points, and boil it down to what I, what I call behavioral DNA, and then scope the rest of the digital universe, right, to find people who have the exact same [00:26:00] footprint, or digital or psychological behavioral DNA
[00:26:05] ck: and then
[00:26:06] Andy: would you like to target those people? Okay. Now, when you start spending your money that way, you're not wasting a nickel targeting people that may or may not care about people crossing the border and being harmful to your family. No, you're spending on the exact people. And because it was the first time you could ever put video into the live feed on the phone on these platforms, now you can start showing video content to these people, not through email, but on Facebook, on Instagram, on YouTube.

Now you're starting to surround-sound them, right?
[00:26:48] ck: Omnipresence
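The lookalike step, boiling a seed cohort down to a "behavioral DNA" and scanning the wider universe for matches, is essentially a similarity search around the cohort's average profile. A toy sketch, with three made-up features standing in for the tens of thousands of data points real platforms hold:

```python
# Toy "lookalike" expansion: find users whose feature profile sits close
# to the average profile (the "behavioral DNA") of a seed cohort.
# Feature names and values are invented for illustration.
FEATURES = ["age", "rural", "news_hours"]

seed_cohort = {
    "u1": {"age": 55, "rural": 1.0, "news_hours": 4.0},
    "u2": {"age": 60, "rural": 1.0, "news_hours": 5.0},
}
universe = {
    "u3": {"age": 58, "rural": 1.0, "news_hours": 4.5},  # close match
    "u4": {"age": 25, "rural": 0.0, "news_hours": 0.5},  # far away
}

def centroid(cohort):
    """Average feature vector of the seed cohort."""
    n = len(cohort)
    return {f: sum(user[f] for user in cohort.values()) / n for f in FEATURES}

def distance(a, b):
    """Euclidean distance between two feature profiles."""
    return sum((a[f] - b[f]) ** 2 for f in FEATURES) ** 0.5

def lookalikes(cohort, universe, radius=10.0):
    """Everyone in the wider universe within `radius` of the cohort centroid."""
    c = centroid(cohort)
    return sorted(uid for uid, feats in universe.items()
                  if distance(c, feats) <= radius)

print(lookalikes(seed_cohort, universe))  # ['u3']
```

Real lookalike audiences are computed inside the ad platforms themselves, over proprietary data and far richer models; the point here is only the shape of the computation.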
[00:26:51] Andy: And then, to these new people you're surrounding, you're like, um, do you care about this issue? Sign a [00:27:00] petition, such as build a wall, right? Build a wall. Now what you're doing is collecting, not the 50,000, but maybe 2 million people, the exact 2 million people, their email addresses.

Okay. Okay, now you've got the exact target audience, the 2 million people who give a shit and are petrified of somebody crossing the border and doing something to their family. Now you take them, the whole 2 million, put it back into the custom audience of Facebook, which is also Instagram, the custom audience of Google, which is also display, and search, and YouTube.

Now you start, now you start feeding video and everything back at them, around the exact same issues. It's, oh shit, they're coming after my family. And then all you gotta do is, um, start sharing those email lists, uh, with other platforms out there, um, such as, you know, these fake, pop-up journalist sites. [00:28:00]

And now you just start reinforcing, from many different angles, this truth of horror that's coming at you, right? Okay, from there, holy shit, you care so much about that, that all of a sudden, you know, it bursts, um, you know, the Proud Boys, or whomever, right? And then, you know, and then you just see clashes. So that's, I hope that
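Taken together, the cycle walked through here, seed cohort, lookalike expansion, petition capture, re-upload, compounds on each pass. A deliberately simplified model, with both rates invented for illustration:

```python
# Toy model of the amplification cycle: lookalike expansion grows the
# audience, petition capture re-seeds it with exact email addresses.
def expand(seed_size, match_rate=3):
    """Stand-in for lookalike audiences: each seed member matches
    `match_rate` similar users in the wider digital universe."""
    return seed_size * match_rate

def capture(audience_size, sign_rate=0.4):
    """Fraction of the targeted audience that signs the petition,
    handing over an email address for the next round."""
    return int(audience_size * sign_rate)

def run_cycles(seed_size, rounds=3):
    for _ in range(rounds):
        seed_size = capture(expand(seed_size))
    return seed_size

print(run_cycles(50_000))  # 50,000 -> 60,000 -> 72,000 -> 86,400
```

With any match rate and sign-up rate whose product exceeds 1, the exactly-identified audience keeps growing round after round, which is the mechanism behind the jump from 50,000 profiled readers to millions of captured email addresses.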
[00:28:25] ck: So I really, I really, really appreciate the primer, let's say, on how to leverage data to polarize.

And, um, so when you said, with AI, it's game over: why did you say that?
[00:28:43] Andy: Well, because the systems that I'm talking about, that I helped build,
[00:28:47] ck: right.
[00:28:48] Andy: Take human interaction such as pulling the levers on the media buys, but also curating the right content that will [00:29:00] put somebody into a state of fear and then anger.
Um, um, I just the level of data that it can process the speed at which it can process and now generative AI, you can store the entire world's imagery, um, and play and surround sound you with it, um, to your emotions, right? And it can also create, create experiences. Thank you. The deep fakes, so whatever,
The images that we saw coming out of the Middle East over the past, you know, 10 days or so un on, whether it's in the West Bank or Gaza, or, or if it's inside of Israel.
These are horrific, horrific, horrific images, surround sound, people on either side that could be escalated to anger and hate.
[00:29:55] ck: Um,
[00:29:57] Andy: and then it can just generate, [00:30:00] through deepfakes, whatever images will, um, agitate you the most. And we have feedback loops on it: if it agitates you the most, you'll watch the video for longer, which it can measure.

Uh, you'll, you'll, you'll click or forward, which it can measure. So now it can measure exactly, with the feedback loop, right? Show you this action here, show you more of this action there, enhance the imagery, right? To completely drive you to a state of, you know, um, whatever, you know, the deepest state of belief system and fear and anger, and they're coming for you.
Yeah.
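The feedback loop described here, show a variant, measure watch time and clicks, then show more of whatever held attention, has the structure of a multi-armed bandit. A minimal epsilon-greedy sketch, with invented engagement numbers:

```python
import random

random.seed(0)

# Hypothetical true average watch time (seconds) each content variant earns.
# The system never sees these directly; it only observes noisy samples.
TRUE_ENGAGEMENT = {"variant_a": 12.0, "variant_b": 45.0, "variant_c": 20.0}

def observe(variant):
    """One noisy measurement of how long a viewer watched this variant."""
    return TRUE_ENGAGEMENT[variant] + random.gauss(0, 2)

def run_feedback_loop(rounds=300, epsilon=0.1):
    totals = {v: 0.0 for v in TRUE_ENGAGEMENT}
    counts = {v: 0 for v in TRUE_ENGAGEMENT}
    for _ in range(rounds):
        if random.random() < epsilon or not all(counts.values()):
            variant = random.choice(list(TRUE_ENGAGEMENT))      # explore
        else:                                                   # exploit
            variant = max(counts, key=lambda v: totals[v] / counts[v])
        totals[variant] += observe(variant)
        counts[variant] += 1
    return max(counts, key=counts.get)  # the variant shown most often

print(run_feedback_loop())
```

The loop reliably converges on the variant with the highest measured engagement. No one has to decide which imagery is most agitating in advance; the watch-time signal selects it automatically, which is why adding generative content on top of this loop is the "game over" scenario described above.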
[00:30:40] ck: So, so, I didn't even think that we were going to get here, but since we're on this topic, I'm just curious, more, like, forward-looking type stuff, right?
[00:30:50] Andy: You're keeping us on this topic.
[00:30:52] ck: No, no, this is great. I've, I, that's why I love podcasts, because you never know where the conversation goes, and here we are.
Great. [00:31:00] So, one of the things that people say a lot is, don't hate the player, hate the game. Right? So I don't think the individual players in the media business are evil, bad people per se, but the entire game is structured around, hey, we get paid when we get attention. And therefore, how do we get the most attention?

Guess what? Polarizing viewpoints. The more polarized people get, the more triggered they get, the more attention they pay. I think the famous quote was, hey, um, Howard Stern fans listened for X number of hours; people who hated Howard Stern listened for double that amount. So it's kind of like that, right?

So now the whole system is rigged in that way, and AI is weaponizing that. So more and more people are going to get polarized. And that's not a good place to be, because that means more instability, more wars, more friction, more conflicts, more bigotry, all those things. Are there ways for us to bring [00:32:00] us back to more neutral, meaning equanimity, groundedness, using the same mechanisms, or changing the game in such a way that it can bring us back to that?
[00:32:15] Andy: Yes. Yes.
[00:32:17] ck: Uh huh. Um,
[00:32:19] Andy: I started saying don't hate the player, hate the game probably about seven years ago, because I was, I was pretty angry, right, at certain people who weren't doing anything about this, and I knew better. And then I started studying the game. And, you know, one issue which I think is a big problem, and which I think could be easily addressed, is that the fiduciary responsibility of a board of directors or a CEO is purely to the shareholders. 100 percent. Like, that is your job. And if you're a board member, like, you can get sued for not doing that. If you're a CEO, you can get sued for not doing that well. Um, [00:33:00] that's, like, such a simple thing, right?

Change fiduciary responsibility, um, to run not just to, uh, your shareholders, but to society, or at least to your own consumers, right? You know, I mean, that's just an easy change to the capitalist system.
[00:33:20] ck: Wait, wait, what does that mean though? Like, does that mean that, you know, I don't know, like society gets a percentage of the equity or voting power.
Is that, I don't understand what that means.
[00:33:31] Andy: Oh, sure. Sorry. Um, yeah. Um, so, uh, you're, you're, you're paid to be on a board of directors
[00:33:40] ck: or you're
[00:33:41] Andy: not on a board of directors. Um, and the board of directors makes the decisions of the company. The CEO reports to the board of directors, right? And today, you know, your only job, and it's in the law, your job is to make more money for the shareholders.
[00:33:57] ck: Yep.
[00:33:58] Andy: Period. [00:34:00] But at the expense of what, right?
[00:34:04] ck: Everything else, yeah.
[00:34:05] Andy: So if you just change that one thing,
[00:34:08] ck: right?
[00:34:09] Andy: So your fiduciary responsibility isn't just, 100 percent, to make more money for the shareholders. You've also made it a legal thing that you have a responsibility to not harm. It's, it would be that simple. Um, and, and with that, you know, I do believe that the biggest tech players would say, oh, we can't do that. But we're living in, like, clearly, a mental health crisis. The suicide rate for teens has, like, grown, I think, by, like, 30%, um, since the advent of, you know, social media on our devices.

That's crazy. That's insane. But the CEOs and, you know, the boards of directors, [00:35:00] I'm, I'm guessing, would be thrilled if there was a law that said the only thing you're measured on isn't just how much money you make for your shareholders; you also have a greater responsibility to do no harm.
[00:35:19] ck: Are there any kind of, I don't know, there's, there's like B Corp, there's all these other sorts of organizational, I don't know, agreements or something like that, that people voluntarily sign themselves up for. Is there anything like that, where you start to get a group of companies or cohorts together and say, yeah, I agree with you, Andy, we're going to agree to do no harm? Anything like that?
[00:35:48] Andy: I don't think so. Not in the bylaws of a company. And, you know, clearly one of the issues with doing that is, um, [00:36:00] the money. The job of people who manage money is to make more money. Period. People who manage money, it's their job to make more money. Okay, so say you could invest in a company that's solely focused on making money, or its competitor, who says, sure, make money, but do no harm. Which in this economy would be like, oh, you know, we're not going to fuck up the kids anymore, but we'll make less money.

The money people, whose job it is to make more money, will invest in the company that's solely focused on making money.
[00:36:30] ck: And
[00:36:32] Andy: when that happens, like, Oh shit, less money for this company.
[00:36:35] ck: That's right.
[00:36:37] Andy: Goes under. So we're in this competitive world. So if it's not a legally binding thing... like, there is nothing in our constitution that says the responsibility of business is just to make more money.
Why isn't there something? I mean, it wouldn't be hard, globally, as part of capitalism. And it could be a beautiful thing. But then we [00:37:00] get into the issue of, oh shit, if we're not doing it, particularly in the world of AI, then some other country, China probably, right, is going to be better than us at it, and have the ability to hold more of the world's attention.
And therefore beat us in this economic war that we're in. So that's kind of the global issue. Um, you know, don't hate the player, hate the game. I think people have to get over this thing of nationalism and realize that, wait a second, we're one humanity, one planet. And we're in a period of time
where it's gotta be all for one. Otherwise it's all for none. Like, we all lose.
[00:37:49] ck: Yeah, well, as Buckminster Fuller said, something along the lines of: the planet is our spaceship. Why would you trash your own spaceship? [00:38:00]
[00:38:00] Andy: But also, the planet is our spaceship. Why would you trash your own spaceship?
And if you think that, you know, there are going to be a whole lot of have-nots in the world, and that that's going to be a good thing for the haves? Sorry. You just got to study every revolution
[00:38:19] ck: that's
[00:38:19] Andy: ever happened. Like, the haves are going to be the first ones they're gunning for.
[00:38:24] ck: That's right.
[00:38:25] Andy: Like the first one, French Revolution, you name it.
[00:38:28] ck: That's right. Especially when the world is polarized.
[00:38:32] Andy: You got it. That's, I mean, when it's
[00:38:35] ck: peacetime, everyone's happy, prosperous, you know, no problem. And it's a lot
[00:38:41] Andy: more have nots.
[00:38:42] ck: Yeah,
[00:38:43] Andy: And with this polarization, it's like, my kids aren't safe. I can't provide for my kids. Ninety percent of the world is in that wicked conflict.
My kids aren't gonna have a good life.
[00:38:59] ck: [00:39:00] Mm hmm.
[00:39:00] Andy: They're just not. It doesn't work that way. It can't work that way.
[00:39:07] ck: Yeah,
[00:39:08] Andy: They'll go after the haves.
[00:39:10] ck: Yep. Okay. So thanks for sharing this, by the way. I appreciate the deep dive into the 2014 story.
[00:39:20] Andy: I appreciate that deep dive. Let me do a little breathing and stretching.
[00:39:24] ck: Yeah, seriously. Let's, let's now...
[00:39:26] Andy: We can laugh about it.
[00:39:29] ck: I mean, what a great motivator to dedicate your life towards this new path that you're on, right? Yeah. So let's, let's talk about it: frequency is medicine. Tell us a little bit about how that concept came about, frequency is medicine.
[00:39:48] Andy: Oh, sure. So, first of all, I'm on the board of the child and adolescent psychiatry department at Mount Sinai, and on the leadership council [00:40:00] of the New York Stem Cell Research Foundation, which is the largest research foundation around what can be done with your skin cells, reverse engineering them into brain cells, right?
And my undergrad degree at Cornell was abnormal psych, which is basically an early generation of behavioral economics. So, how can you influence somebody to make decisions that are not in their best interest? So, uh, I've been...
[00:40:32] ck: Influence people to... what? One more time. Decisions...
[00:40:37] Andy: That are not in their best interest.
[00:40:41] ck: Okay, say more about that.
I don't understand. One more time. I still don't understand. Could you say that again in a different way, please?
[00:40:48] Andy: Okay, sure. Um.
[00:40:51] ck: What would be an example of that?
[00:40:54] Andy: Sure, so there were all these studies that were done in the 50s. One, I believe it was either [00:41:00] Stanford or Berkeley, where you took a focus group, right?
And you cut it in half. And, not a focus group, but a testing group, a research study.
[00:41:13] ck: And you
[00:41:14] Andy: went to a prison, and they put half of these students inside the prison as prisoners,
[00:41:20] ck: and they put
[00:41:21] Andy: half of them as guards. This is pretty famous stuff.
[00:41:24] ck: And
[00:41:25] Andy: within, like, a couple of days, the guards became extremely abusive to the prisoners, even though they knew they were just playing a role, right?
That's an example. There's also one called the Skinner box, which was a study done, I don't even know, maybe in the sixties, where you'd bring in two people that they said were part of this study. And one of them they would put behind a glass wall and attach to electrodes. Well, he happened to be an actor.
The other one was the one who you're actually doing the study on. [00:42:00] The person who was doing the study had to ask that person some questions. If they got the question wrong, you were to hit a button that would shoot electricity through them. And the person who was being studied kept doing it.
Even when the researcher said, turn it up, turn it up, turn it up. So, like, what are humans capable of doing? Now, things that are not in your best interest: I don't know if you drink, but, like, can I tell the difference between a Grey Goose or a Ketel One or a Tito's? One's 20, one's 80
[00:42:43] ck: or 60.
[00:42:44] Andy: Why in God's name would somebody spend triple the amount, right? Here's another one. You know, you can go and buy an off-label [00:43:00] Polo Ralph Lauren shirt: the exact same fabric from the exact same factory, the exact same size, the exact same color. One is twelve dollars. The other one has a little pony on it; that's sixty dollars.
That makes zero economic sense, like, none. So, you know, if you understand that, along with the mechanisms which I just described, you can get people to do things that are economically irrational, that make no sense.
[00:43:42] ck: I see. Okay.
[00:43:43] Andy: One of my, I guess, teachers of sorts, Daniel Kahneman, won a Nobel Prize in this thing called behavioral economics.
He won the Nobel Prize in economics, but based on cognitive behavior. He said to me, [00:44:00] it's extremely unlikely that a human being has the capacity to be rational. What? But listen, this is just science. So, you know, the study, the discipline of behavioral economics, combined with the communications we have available, it's not that hard to get people to do stuff that's not in their best interest.
And now with AI behind it, it's literally controlling people. Even though we think we walk around with free will, which we do, you have to be really aware of how you're being influenced.
[00:44:50] ck: Yeah, but it's so subtle. Yeah. I can't remember exactly what movie it was, but people basically drop a little hint, and eventually, over [00:45:00] time, you make a certain decision, pick a certain, you know, car out of a lineup.
It's been shown as entertainment before, so.
[00:45:09] Andy: I mean, I'll give you a fun little thing, which was another study that was done. It's called anchoring. So at the University of Chicago, they brought in a whole bunch of people for this study. And the people who came through had no idea what they were being asked.
But it was a series of things you had to do. When you first came in, there was a roulette wheel, and they would just spin the roulette wheel, and it would land on a number. Well, the roulette wheel was rigged, right, to the number 7 and, I think, the number 27, something like that, right?
Either 7 or 27. Then you go to the next thing, whatever you're being tested on, and then they ask you a question: how many countries are there in Africa? For those where it landed on the number 7, they would [00:46:00] guess, because very few people know how many countries are in Africa, they'd be like, ah, 6, 7, 8, 9. If it landed on the number 27, people would answer something like, you know, 26, 27, 28, 29.
So you can anchor things into people's heads, and you have no idea it's happening. Like, none.
[00:46:21] ck: Great. So the original question was: explain the concept of frequency as medicine, how did you get there, and, in more detail, how it benefits an individual's mental fitness.
[00:46:35] Andy: Sure. Um, so the science has now proven, and hospitals are working on it and agree, that meditation is extremely beneficial for quieting our minds.
And if you quiet your mind and get your brain down from, think from beta to alpha, then to a theta wavelength, it can literally calm down your nervous system. [00:47:00] And as you calm down your mind and your nervous system, it will interact with your adrenal glands, which will either release adrenaline and cortisol, which are like the fight-or-flight ones, or it will actually give you estrogen or testosterone, which is like, okay, we're in a good state.
Go and reproduce. Don't worry, nobody's coming at you. So meditation is extremely beneficial. However, meditation is hard, and it takes, you know, weeks, months, if not years of practice to actually be able to quiet your brain from all the thoughts, you're talking to yourself all the time, to be able to settle down and get the positive benefits, the mental health benefits, right, of being able to meditate.
It's just hard, and in today's world, who's got that much time? And if you're stressed out and have anxiety, and you tell somebody, go meditate, and they can't meditate, it just elevates their level of frustration
[00:47:53] ck: even more. Yeah.
[00:47:56] Andy: So, probably about seven years ago, I went to [00:48:00] my first, it's called a sound bath, where you go to a yoga room or whatnot, lay down, and these musicians play certain instruments: Tibetan singing bowls, didgeridoo, which is, you know, an Aboriginal instrument, crystal bowls, gongs. And the people who go there, just by laying down and being quiet, closing their eyes and listening, are getting the same benefits that one would get after becoming very good at meditation.
So when I realized this, I thought, wow. It's way easier, takes less time and less frustration, to get the benefits of meditation if you're doing it with this type of sound bath. [00:49:00] So then I just started thinking about, can we digitize it, democratize it, and get it out to everybody?
Everybody. So that's what happened: I met this really great guy when I went to Kauai for COVID, which was lucky and fortunate and amazing. I met this incredible guy and some of his friends who were really into, you know, sound healing. And the guy asked me if I wanted to get involved with his new company, to actually be able to bring and democratize access to sound baths and sound healing out to the world.
So I agreed. About a year and a half ago I became the founding CEO of this company, and we built the most incredible TV recording and sound recording studio in [00:50:00] Santa Monica. Over a million dollars of just engineering work, to be able to capture the pure quality of the sound in surround sound, spatial audio, binaural, right?
And at this point in time we have had, I think, about 12 of the leading sound healing practitioners from around the world come in and record in our studio, and we've put together a full library, with audio and visual, in the highest fidelity that technology can possibly achieve at this moment in time in sonics. And then we built an app for it.
It's called Five Set. I think that's the first time I've ever said it to anybody.
[00:50:49] ck: Oh, thank you for sharing that.
[00:50:52] Andy: Yeah, we haven't even launched it yet.
[00:50:54] ck: Um,
[00:50:55] Andy: It'll be launched in app stores probably in about three weeks. But with it, it is, you know, a full library of all these different instruments as well as sound healers,
brought down to the individual. And the goal is to then hopefully partner with Apple and all the biofeedback devices to optimize. Well, first we're just going to be studying people's behavior, to give them a personalized feed of what actually works for them, makes them feel good, right? But that's kind of
[00:51:25] ck: The future... before you go into the future, Andy, let me ask you some key questions about the construction, because you mentioned it took you, you know, a million dollars in spatial engineering and sound design and all these things to really capture the sound healing.
For the layman who has no idea why it's so hard to capture the sounds of gongs or sound bowls or things like that, maybe you can speak a little bit about it, so that people can really appreciate, right, all the work and investment [00:52:00] you guys put in to capture it.
[00:52:02] Andy: Yeah, thank you. So each of these instruments puts out sounds that have certain frequencies, high or low, that don't get captured by almost any microphone. The full spectrum of those sounds is critically important, and it's one of the reasons why it works so well in person in these sound baths.
So that's one thing, which was a hard engineering task. The second thing is, how do you deliver spatial audio, so when you close your eyes, you actually feel like you're there?
[00:52:43] ck: You know you're in the middle of it, versus it sounding like it's somewhere else.
[00:52:49] Andy: Well, there's stereo, so it's here or here.
But how do you make sound appear like it's literally over your head, or behind you?
[00:52:58] ck: Right.
[00:52:59] Andy: Surround [00:53:00] sound. So our practitioners, which we call maestros, work with, I think, 36 different mics in our studio, and a bunch of binaural mics, which is a way of capturing sound very specifically. They'll either be in place, working with the gong or flute or didgeridoo,
but at times they'll come up around certain mics. And so, when your eyes are closed, each individual who's listening feels like they're getting a personalized sound bath by one of the best sound practitioners in the world. Right. And with the people I've experienced it with, it's so much fun, because sometimes you'll watch them open their eyes and [00:54:00] look behind them, because they really think someone's there.
Right. And, you know, the people we've shared this with, there's something called audiophiles, people who are just obsessed with audio. They're shocked by how we've been able to not just record it and capture it, right? Because each time that you turn acoustic, right,
live acoustic, into digital, it distorts the quality of the sound waves.
[00:54:25] ck: Um,
[00:54:27] Andy: We had to put together a system that did that conversion once, but not more, right? So that it wouldn't distort the quality, not the quality, the true essence of the sound waves. Right. And then having to be able to store it,
[00:54:50] ck: um,
[00:54:50] Andy: compression, and then being able to stream it without compression and distortion, to literally get down to your over-the-head [00:55:00] headphones off of literally your handheld device, was really hard.
And, you know, we had some of the leading advisors in sound engineering and acoustics and whatnot, to not just build the studio, but then to figure out the technology of the capture, the storage, and the delivery.
[00:55:23] ck: Got it. Even the signal processing aspect is so important. I got it. So music is one of those things.
Well, number one, how do you even calibrate? You know, there are no rankings, there's no top 40, no most popular sound healer in the world, nothing like that. So how do you even do that?
[00:55:53] Andy: Um, word of mouth.
[00:55:55] ck: I see.
[00:55:56] Andy: Yeah, word of mouth. So there are certain people [00:56:00] that, when you speak to, like, 10 or 15, let's just call them really popular sound healers, they're all pointing in the direction of someone as the best.
[00:56:15] ck: Mm.
[00:56:17] Andy: And, you know, there's a guy named Mach... no, not Machu... I'm forgetting his name right now, who's probably the best in the world, and he's from Nepal, at a specific wooden flute,
[00:56:32] ck: right?
[00:56:33] Andy: And because we have a lot of friends in the neuropsych space, as well as the sound healing space and the health and wellness space,
we've been able to convince some of those people to come to the U.S. and to our studio to record, and to come back often. You know, the way some places have resident DJs, we have resident maestros, right, who are doing that. And then, you [00:57:00] know, the fun part, which I'm... can I talk about the future yet?
Or can we speak to it? Yeah, please.
[00:57:05] ck: Of course. Of course. Go for it.
[00:57:08] Andy: It's the biofeedback.
[00:57:10] ck: Oh yeah, I was going to ask you about that, because my experience of sound healing is that it's improvisational. Rarely do they have, like, a set list, right? There are no scores. Usually they just kind of feel it in the moment.
And so usually these types of performances are one and done. So I'm so glad you're able to capture it. But it's also so personalized. How do you make sure that you're not just preferring the sound, but actually getting the healing? You know what I mean?
[00:57:46] Andy: I know exactly what you mean. Thank you for asking that question.
Um, so yes, if they feel better, fantastic. And if our algorithm behind the scenes is recommending the best [00:58:00] stuff for them and they're feeling better, even more fantastic. I mean, that's really all we care about: people feeling better, being more in a state of joy and love and comfort in their own skin, and then spreading that.
But through partnerships with different biofeedback companies, you know, measuring your heart rate variability, measuring your oxygen level, measuring your brain wave states, right? There are these devices to partner up with. So with that feedback loop, we can actually learn what actually affects you, you personally, your brain states. What brings you, you personally, into a state of theta?
And at what time of the day does one thing bring you into a state of theta versus something else? And then maybe you want to get into, like, a state of gamma waves, which is really high productivity. What will work for you? [00:59:00] Instead of just us guessing, no, it's the scientific method down to an individual, right?
Let's measure the results of how your body and brain are reacting to certain signals. And then, you know, I don't even know if we're going to do this, but long term, at least for me, the goal is to get FDA approval for this stuff.
[00:59:20] ck: Oh, that's right. I was going to ask you that follow-up question anyway. Yeah. What are some of the potential hurdles? Well, number one, why do you even want to get FDA approval?
Two, if you do, what are the potential benefits, and what are the hurdles of turning sound or frequency into medicine?
[00:59:40] Andy: Yeah. So, first of all, luckily, we're not the first ones. There was a trailblazer, there is a trailblazer, and it's a guy named Adam Gazzaley. Okay. And he runs a lab
at the University of California, San Francisco, which is called [01:00:00] Neuroscape. And out of his lab he developed a video game, which, if you think about what video is, it's just light waves and sound waves. Just frequency: light waves and sound waves. And he's done the clinical trials for ADHD.
And the FDA approved his video game as something that would be as effective as feeding your kids Adderall. Holy shit. And he was the first ever to get FDA approval for a digital, otherwise known as frequency, light wave and sound wave, product to help with a mental health issue.
[01:00:47] ck: Well, why do you even want to do that? I mean, obviously there are a lot of hurdles, difficulties, investment, to jump through the hoops, right? Why do you even want to do that?
[01:00:57] Andy: Because... no, [01:01:00] I'm not sure we do. But if we did, and I do want to, it's because there's a whole population out there that would probably think this is woo-woo.
Mm. Just like, okay, what? I'm gonna meditate and my problems are gonna go away? Or I'm gonna listen to some sounds and it's really gonna help me with my sleep, or my anxiety, or my depression, or whatnot? Or it's gonna snap my kid out of a state of trauma? Right? People like validation.
[01:01:36] ck: Mm-Hmm. .
[01:01:37] Andy: And the biggest validator out there currently, in today's world, around medicine, is the FDA.
Do we need it? Absolutely not, because there are no FDA requirements around sound as medicine. There probably should be. There probably should also be... ooh, good idea. Wow, this is an interesting idea: FDA [01:02:00] approval around media.
[01:02:03] ck: Yeah,
[01:02:04] Andy: pretty good idea. Um, yeah, think about what I was saying before. I do know.
I know.
[01:02:09] ck: Yeah, for sure. I mean,
[01:02:12] Andy: Because there is precedent. Because of Adam Gazzaley, there's precedent that light and sound frequency media is medicine, because they approved it for ADHD. So maybe that's actually true.
[01:02:28] ck: I'm just thinking about the antagonism of those whose pocketbooks it's going to hurt.
They're like, no. That's what I think about right away. So, very...
[01:02:39] Andy: very, very fair.
[01:02:41] ck: Yeah.
[01:02:42] Andy: Extremely fair. And I've gone up against those giants in the media world, the political world, and lots of different powerful worlds. And it is the reason why it would probably be foolish of us to even get on the radar of the pharmaceutical industry. [01:03:00]
Because, oh shit, if you can put this on your head and, like, you don't need to take a bunch of pills? That's gonna hurt somebody's pocketbook.
[01:03:09] ck: Yeah, I mean, we can argue why FDA, right? FDA is good because, as you said, it adds legitimacy, validation, credence to the thing. But there are many, many other things you can add as credence. Now, if you say, all right, we're going to market it as a drug, something with medical insurance money or whatever, okay, that makes sense.
But if it's just purely for the consumer, there are many, many things that people buy without needing to go through the rigors.
[01:03:45] Andy: I so agree with you, and it's not needed, because, like, the cost of the app is not going to need to be covered by insurance.
Right. But I also do want to help all these alternative [01:04:00] health practitioners, as they're now called, alternative, like sound healers, right, with their validation,
[01:04:04] ck: um,
[01:04:05] Andy: because some of this stuff is actually expensive. It's very expensive to get a personalized sound healing done, like, extremely expensive.
So if insurance covered that, that'd be tremendous. Like, absolutely tremendous. I think insurance now covers acupuncture. Is that true?
[01:04:24] ck: I don't know off the top of my head. I think so, because I may have used it. I don't know, it's been a while.
[01:04:33] Andy: Yeah, it's probably a bad idea for us to do it, but you just mentioned, yeah, I mean,
[01:04:40] ck: I'm just, you know, thinking about it.
So, you know, I'm not making any suggestion one way or the other.
[01:04:46] Andy: I trust you. I think about it quite a bit, and, well, it's just about helping as many people as possible.
[01:04:54] ck: I mean, the thing that I think about, right, just riffing here off the top of my head, [01:05:00] is you can effectively be the validation lab yourself, right?
Since you have the equipment and you're doing all the engineering stuff, you probably have access to all the measurement tech as well: HRV, brainwave, oxygen level. You can effectively publish your own collected papers, and that would be validation enough, right, if you publish certain things?
Either with university partners or by yourself, to the scholarly world. That would add the credence that you need, no?
[01:05:36] Andy: All on the roadmap. Exactly. I mean, all on the roadmap, right?
[01:05:40] ck: Yeah.
[01:05:41] Andy: You know, just get a few hundred people into a room linked up to all these biofeedback mechanisms.
[01:05:48] ck: Yeah.
[01:05:49] Andy: And just get case studies, right?
With close observation by the scientific community. That's all, that's all on the roadmap.
[01:05:57] ck: Yeah, I mean, [01:06:00] like I said, pharmaceutical... no, not pharmaceutical, nutraceutical companies have sold lots of things without the rigor of scientific study. Now they're trying to get more rigorous, but, you know.
[01:06:13] Andy: By the way, you touched on something before, which is around, you know, certain sounds will do certain things to certain people, right?
And the easiest way for me to explain this to people who are not familiar with sound medicine is, you know, there are certain songs that each of us listen to that give us energy. They might not give somebody else energy; they might hear them as noise. And there are certain songs that
might bore you to tears, but might make somebody else feel love and romance. So how do we customize sound medicine?
[01:06:51] ck: Oh, that's right. Yes, I wanted to ask you about preference versus response. How do you basically create, like, a [01:07:00] Pandora of sound healing, effectively? Based on my response, you give me something.
[01:07:04] Andy: That's easy. That's already in our algorithm: just studying what you pay attention to, what you like, which sound healer you come back to more frequently. Do you listen to one third of it or all the way through? How often do you come back? Which instrument, what time of day?
That's all baked into our algorithm, right? So that's beautiful, but people like to see results. So with biofeedback devices, and actually plugging into Apple Health, you're actually seeing what it's doing to your biometrics, your biomarkers, right? Now we're starting to get, like, hardcore.
Now you're starting to get into not just recommendations, but prescriptions, algorithmically based on your biology: [01:08:00] what sounds to use when you're in what state of being, right? So if you're studying your own heart rate variability, your oxygen level, your blood pressure, and your brain wave states,
and you say, hmm, I'm really jacked up, I need to chill the hell out because I've got to go to sleep in, like, half an hour. If you're connected to these devices and you have been fed certain content, then when you're in these different biological states, just for you specifically: take this medicine, right, these frequencies, these instruments, this length of time, this mood, this whatever, and it'll drop you back down, because we've seen it time and time again in the patterns of your biometrics.
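[Editor's note] For readers curious about the mechanics, the selection logic Andy describes, recommending whatever recording has most reliably moved a particular listener from their current brain-wave state toward a target state, might be sketched like this. All names, states, and data below are hypothetical illustrations, not the actual Five Set algorithm:

```python
# Hypothetical sketch of the biofeedback-driven selection described above.
# Recording names, state labels, and the log data are illustrative only.

def pick_session(history, current_state, target_state):
    """Choose the recording that most often moved this listener
    from current_state into target_state in past sessions."""
    scores = {}
    for entry in history:  # each entry logs one past listening session
        if entry["state_before"] != current_state:
            continue
        hit = 1 if entry["state_after"] == target_state else 0
        plays, hits = scores.get(entry["recording"], (0, 0))
        scores[entry["recording"]] = (plays + 1, hits + hit)
    if not scores:
        return None  # no data yet for this starting state
    # pick the recording with the highest observed success rate
    return max(scores, key=lambda r: scores[r][1] / scores[r][0])

# Toy listening log: measured brain-wave state before and after each session.
history = [
    {"recording": "gong_a",  "state_before": "beta", "state_after": "alpha"},
    {"recording": "gong_a",  "state_before": "beta", "state_after": "theta"},
    {"recording": "flute_b", "state_before": "beta", "state_after": "theta"},
    {"recording": "flute_b", "state_before": "beta", "state_after": "theta"},
]

print(pick_session(history, "beta", "theta"))  # flute_b
```

A real system would fold in the other signals Andy mentions (time of day, heart rate variability, completion rate) as additional features, but the core idea is the same: per-listener outcome measurement rather than global popularity.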
[01:08:56] ck: Ooh, here's a cool thing. [01:09:00] I don't know if you know... oh man, how do I describe this? You can record a human voice, then produce a certain scale from it, and then use that scale to make a song. You can basically play it like a piano and the human voice comes out. Yeah. Right. So what if you could do the same with certain sound patterns, maybe the flute, maybe the sound bowl, or maybe the didgeridoo, certain sound frequencies?
And based on that, you could essentially use AI-generated composition based on this feedback.
[01:09:33] Andy: I'm not okay with AI-generated. You don't even need AI; you need machine learning.
[01:09:38] ck: Okay.
[01:09:39] Andy: To study this amount of data, to then be able to compile which instruments, put together in what sequence, over what amount of time, at what volumes. But I do not want to use... I'm fundamentally opposed to using [01:10:00] machine-generated sounds.
[01:10:03] ck: Great.
[01:10:06] Andy: Sure. Um, well, first of all, I don't know your audience all that well, because I'm about to hop from science mode, and whatever, into more like consciousness mode.
[01:10:18] ck: Perfect. Perfect. Our audience is the same.
[01:10:23] Andy: Okay, cool. So I'll start with, like, the physics, the quantum physics world.
Which is, no shit, we're surrounded by frequency everywhere.
Our people, our maestros,
they're sitting in a space and time, in three-dimensional reality, floating in a sea of frequency. These people are tapped in, and it comes through their bodies. You're right: [01:11:00] they don't play a song. No, it just comes through them. They play the right thing at the right time. If there's a pattern to it, your brain can predict what comes next.
If you're listening to words, your brain gets used to those words and knows what comes next. So it's very easy for your brain to get distracted, and maybe even start singing along. When you know what comes next, it's very hard to calm that brain, to keep it from being distracted by other voices or memories that might pop into our heads.
So, because it appears to be a random playing of different instruments [01:12:00] by this tapped-in human, which is just an instrument itself, being played by the frequencies surrounding them, right... and there's also something very important about these instruments: they're all from an indigenous place,
from cultures around the world. They're all made with, yeah, pretty much, so much love and attention. If you understand, like, the history of matter, matter and energy, each of these instruments is just condensed energy, or [01:13:00] wavelengths. All right.
So when you're actually having a human downloading, that's a little bit of a term, but, like, surrounded by frequency and being played as an instrument, playing the actual instrument,
which also has so much information gathered in it, being able to capture that is really important. Raw and true, biology on biology,
in our brains. AI doing the same stuff? [01:14:00] Man, I just can't. Let's just say there's no soul in it.
[01:14:06] ck: Got it.
[01:14:08] Andy: I believe that's very important.
[01:14:10] ck: Why is that? I don't know. And by the way, I'm agreeing with you. I'm just asking.
[01:14:17] Andy: Um, we're going to find out soon. I don't know. And I've seen too many times media poison people. So I go back to, I guess, the original medicine of humans. And because of the digital world, we've just been able to [01:15:00] do it the best we can. Digital distribution, bringing that love down to the human, the best we can, right?
To make people feel loved, supported, calmer, self-regulated. Bathing people in love. So it's just core to my belief system. I can't prove why, or why not.
[01:15:28] ck: No, I mean, at the end of the day, Andy, why do we do what we do? Some people use the five-whys method: why, why, why, why, why, to try to get to their truth.
Some people use the what method: what's important about this? What's important about this? And just keep asking to get to their truth. And some people use religion to get to their truth. At the end of the day, the why is irrational. We [01:16:00] do it just because it's meaningful, right? If it's meaningful for you. I'm merely asking the question
why, so that we understand where you're coming from. I'm not debating with you or anything like that.
[01:16:13] Andy: No, I appreciate that. They're really good questions. And I love how you framed it that way. Because it's my truth.
[01:16:20] ck: Yeah. And that was great. And there's much to discuss there. We'll find out soon enough.
I think this is me forward-projecting, forward-looking. People have a really intimate relationship with their phone already. One may say they love their phone; they even use that word. Now, wait until the phone responds back to you in kind, with a human voice, anticipating what you like and saying things that you like, much like the movie Her.
Or, um, [01:17:00] there's a show I watched on Netflix called Better Than Us, or something along those lines. It was very much in the camp of what you just said, a humanist take: we can have a genuine relationship with a digital being. I watched it and I'm like, hmm, maybe I changed my mind. But we'll find out, because, you know, it's still theory.
[01:17:25] Andy: It's actually not that theoretical right now.
[01:17:28] ck: Okay.
[01:17:29] Andy: Are you familiar with Tristan Harris and Aza Raskin?
[01:17:33] ck: I know Tristan Harris, not personally, but I know of him. The second name I don't know.
[01:17:39] Andy: Yeah, so Tristan Harris and Aza Raskin, they're the guys behind the movie The Social Dilemma.
[01:17:43] ck: Yep.
[01:17:44] Andy: They're co-founders of a not-for-profit called the Center for Humane Technology.
[01:17:50] ck: Okay.
[01:17:51] Andy: They've been tapped by the leaders in the AI space, from our biggest media companies, social [01:18:00] media companies, digital companies in the U.S. and globally, and by government and national security folks, to figure out: what the fuck do we do? Excuse my language. I guess it's okay, it came through me.
What do we do about this emerging dilemma of AI? You can go on YouTube and watch it yourself; it's called The A.I. Dilemma. I'm not sure if they spoke about this publicly, but I was briefed on certain things. I believe in China already, people are having love relationships with an AI. Right? As in, their best friend, their most trusted relationship, is with an AI.
And that's my greatest fear. It's my greatest fear because of how we began this conversation, which is really about who do you trust and how easy it is to manipulate.
[01:18:48] ck: Um,
[01:18:49] Andy: So it's an artificial being, right? That knows more about you than you know about yourself. Possibly even remembers more about you than you do.
[01:18:59] ck: [01:19:00] Actually, let me again do a gentle pushback there. How do I articulate this? I don't think AI inherently is bad. It is an amplifier of the creator's intent. If the creator intended it to do certain things, like a weapon, for example: the weapon's intention is to harm, and therefore you can use it to harm, or use it to protect, or whatever articulation you have.
So, let's see, how do I say this? It's a little muddled in my thought, but I think it depends on the creator's intent and the user's intent. And then obviously the gamification of the entire system, which is also going to move people's behavior one way or another.
So anyway, what did you get from what I delivered, or tried to deliver?
[01:19:57] Andy: No, it landed. It landed. [01:20:00] I think what you were unintentionally doing, and this is kind of a side joke, is relieving my pain and suffering from the earlier part of this conversation: I just built a thing, and what the users actually went and did with the thing was out of the control of the original creator. I know that's not what you intended.
What you're saying is, and I agree with you wholeheartedly: AI could be the greatest equalizer that has ever happened on this planet. It could literally lead to the most beautiful life for my kids. An incredibly beautiful life of having to put less effort into this thing called being able to feed your family and take care of your family, because AI can do all of it for us, right?
But here's the dilemma, um, which is, um, for the time being, AI is being trained [01:21:00] on what we humans have done.
[01:21:03] ck: That's right.
[01:21:05] Andy: Alright. If you wanted to create one of the most powerful sources in the world, more powerful than us, would you really want to create something that's better at destruction than we are?
I mean, just look at the history of humankind. It's learning off of everything we've done, finding all the patterns, to be able to not just replicate, but to create anew, and create even better. That's a real dilemma there. So it's just: what information do you feed the AI?
Right?
[01:21:44] ck: Yep.
[01:21:46] Andy: And so
[01:21:49] ck: In response to that, I'm going to butcher this quote, but I'm trying to articulate this thought: anything that you invent cannot exceed the consciousness of the inventor. [01:22:00]
[01:22:04] Andy: Trying to think what that means. Yeah, it's kind of like Pandora's box is open. Like, whatever your intention was... is that what it means?
Like, if your intention was good and your consciousness was good, once it's invented, it's out there and can be used. I mean, you can just look at Oppenheimer. It was a bunch of scientists, but was their intention to kill, I don't remember, like 60 or 70 thousand, a lot of people, with an invention?
No. And I guess when you say consciousness, which obviously, I think we had this conversation around, like, how do we define consciousness? Because it's a pretty trendy word these days. I might just define [01:23:00] consciousness as caring about the whole more than you do about yourself, and realizing we're kind of all in this together.
We're all part of one thing as this future unfolds: compassion, love, and kindness. So, can technology have compassion, love, and kindness?
Well, not if it has a learning set that isn't strongly geared towards examples of humanity's love, compassion, and kindness. So I guess the inventor can have the highest level of consciousness possible. But then, I mean, it's a really good question: can consciousness [01:24:00] travel? Can consciousness be, not just replicated,
yeah, can it be replicated and put into something else? I guess we'll find out. I have no idea.
[01:24:11] ck: Yeah, we'll find out. I mean, going back to the very early part of the conversation, it's based on feedback loops. At the most basic level, whatever you feed the machine, it's going to optimize for more of that.
So if you feed it good data, à la what we're trying to do, right? Enhance inner peace, love, connection. You feed it that data, and it's going to optimize for that. But if you feed it polarization, here's how you divide, here's how you make people really anxious, the machine
has no judgment about it. It's like, okay, sure, no problem. It's going to keep leaning into that more and more and more. So,
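The judgment-free feedback loop CK describes can be sketched as a toy simulation. To be clear, this is purely illustrative: the content types, engagement probabilities, and update rule below are invented for the example and don't correspond to any real recommender system. The point it demonstrates is only what CK says: the optimizer amplifies whichever signal it is fed, with no notion of whether that signal is healthy or harmful.

```python
import random

def run_feedback_loop(engagement_bias, steps=1000, seed=0):
    """Toy engagement loop: show content, and if the user engages,
    show more of that kind. `engagement_bias` maps a content type to
    the probability a user engages with it (assumed values, not data).
    Returns each type's final share of exposure."""
    rng = random.Random(seed)
    # Start with equal exposure weights for every content type.
    weights = {kind: 1.0 for kind in engagement_bias}
    for _ in range(steps):
        kinds = list(weights)
        # Recommend proportionally to current weights.
        kind = rng.choices(kinds, [weights[k] for k in kinds])[0]
        # If the user engages, the system "feeds" itself more of that kind.
        if rng.random() < engagement_bias[kind]:
            weights[kind] += 1.0
    total = sum(weights.values())
    return {k: w / total for k, w in weights.items()}

# Hypothetical inputs: "outrage" engages users more often than "calm".
shares = run_feedback_loop({"calm": 0.2, "outrage": 0.6})
```

Whatever the labels mean is irrelevant to the machine; the rich-get-richer update simply concentrates exposure on the higher-engagement signal, which is the "no judgment about it" behavior CK points at.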
[01:24:58] Andy: So a [01:25:00] line that Tristan Harris uses: give a human a fish, he'll eat for a day. Teach a human to fish, and they'll eat for a lifetime. Teach AI how to fish, and it'll master biology,
chemistry, oceanography, and fish all the fish to depletion.
Seeing as we are right now in a world of desire for money and power on a global scale. On a global scale, down to an individual company scale, right? And then also down to an individual human scale. Power to feed your children, money to feed your children.
If a company, [01:26:00] a country, an individual, or a terrorist group were to program their AI in today's world, there are very few people motivated toward programming for peace on Earth. Which is quite easy, by the way: just wipe out the humans, and there'll be peace on Earth.
And that's probably what it would do. Or for mental health, because there's not a lot of money in people not needing pharmaceuticals. Or calmness and security, so we don't have insecurity, so we don't need to buy brand-name things to make us feel secure. No, it would be optimized for: how do we influence people so we can have more money and power?
That's what the majority of AI will be programmed for. And you're exactly right. So with [01:27:00] that level of consciousness, we're in a bit of a dilemma.
[01:27:10] ck: Beautiful.
[01:27:10] Andy: And is it an AI dilemma, or is it a human dilemma?
[01:27:14] ck: I'm focusing on the Chinese word for crisis, weiji. There's danger and there's opportunity, right? So it's up to us to choose whether we see it as a catastrophe, you know, the sky is falling, or can we see it as an opportunity, and what an opportunity, to create something new and beautiful.
[01:27:34] Andy: So that's where I'm focusing all my efforts. That's where I believe you're focusing all your efforts. I believe that's where most of the people on your program are focusing their efforts. And I also believe in the power of collaboration.
[01:27:46] ck: Oh, that's right. That's a perfect segue. Okay. That's the thing that I do want to ask you next.
Totally forgot about that. You are a builder of communities of extraordinary humans who foster higher consciousness and [01:28:00] beautiful things in the world. Say a little bit more about that community, and what do you hope to accomplish with it?
[01:28:09] Andy: Yeah, I've been a community builder since I was a kid. I ran underground nightlife for New York City private school kids when I was 16 years old, which was a community of renting out loft spaces and having kegs of beer and DJs and dancing. That made me a lot of money, but it also pulled together a community of like-minded people. Then over the years, I would host lots of talks at my house. I've been very fortunate with who my mentors are, people who have become really dear friends, top of their field in lots of different fields.
I'd bring them all collectively around a table to have dinner, with a big whiteboard in my dining room, and we'd discuss world issues, but from different perspectives, to solve problems. And then with the United [01:29:00] Nations, for the 2030 Sustainable Development Goals, Ban Ki-moon's office asked me for help to build the software to allow humans from all walks of life, different villages, different PhDs, this, that, or the other thing, to collectively come up with, first of all, what are the challenges of achieving these goals?
And then collectively come up with the solutions, in terms of sustainable businesses, to achieve those goals. So this has just been my other path in life for a long time. And then when COVID hit, we had a gathering of about 20 very unique, highly skilled people. Like, wildly. Dr. Mark Hyman was there.
Chris Harris was there. Daniel Schmachtenberger was there. I'm not sure if you know him.
[01:29:50] ck: Oh, that's good. Yeah.
[01:29:52] Andy: Dr. Mark Hyman, did I say Hyman?
[01:29:55] ck: You did. Dr. Mark Hyman.
[01:29:56] Andy: Dr. Anita Goel, the leading nanobiophysicist in the world, was there. [01:30:00] And a friend of mine at the time from Mumbai, who ran innovation for the United Nations under Ban Ki-moon. And the subject was higher consciousness.
I was discussing how social media and digital health tracking could lead to lower consciousness. It was kind of a beware thing: what can we do about it? Also Paul Dalio, Ray Dalio's son, was there; it was his apartment, Paul Dalio's apartment. And then when COVID hit, myself and my partner at the time, in work and in relationship, started something we called Task Force C-19, COVID-19, and
it became a weekly Zoom call. It started with 20, and then it grew to 40, and it started going global. And then a guy named Dr. David Nabarro, who ran the 2030 Agenda under Ban Ki-moon, [01:31:00] came onto a call with us. We told him, okay, hey man, I've got 40 amazing humans across the globe who want to be helpful in reducing suffering. So he joined one of our calls and he gave us things to do.
And with that, it grew to eventually 130 of what I call Jedis from around the world. And I'm talking, like, Navy SEAL Team Six team leaders. Some of the best thinkers globally. We were collaborating behind the scenes, along with some of the best athletes in the world, musicians in the world. And we saved a lot of lives during COVID, particularly in the townships of South Africa during lockdown, where we were able to get 60,000 people food, masks, and sanitizer for 90 days.
If they hadn't had it, locked down in shelter with nothing, there would have been riots [01:32:00] and a lot of blood. And then as we kept progressing, we said, okay, this is a forever thing. So we turned it into Task Force Humanity.
[01:32:09] ck: Uh,
[01:32:10] Andy: It's a global group of Jedis around the world collaborating behind the scenes.
Each is embedded inside of either a sports team, a band, a military unit, a scientific brain trust, right? And everybody's collaborating behind the scenes. This is the community I built with this amazing woman, Chioka. It's still alive, going extremely well. And somehow, for the past three or four months, more Jedis have been starting to flow towards me, through relationships with [01:33:00] people. A lot of people are like, oh, you really have to meet this person, because he belongs in this circle of Task Force Humanity. And that's how Jan, right?
[01:33:11] ck: Hmm.
[01:33:13] Andy: He arrived in my life. As you know, he's a three-time world champion in Tai Chi. So he's now my trainer, and Jan's like, oh, you've got to meet this one. He's the one who introduced us.
[01:33:25] ck: Right.
[01:33:26] Andy: So I'm gathering towards me these superheroes around the world, and I connect them with one another when they can help each other. Sometimes I can connect four or five people to somebody who would never be able to get to those four or five people, to help with whatever their mission is.
Whether it's a business, a non-profit, a cause, whatever. So I'm, yes, a community builder at heart, in my truth. I make zero [01:34:00] money from it. It is pure passion: being able to help those who are helping others.
[01:34:08] ck: I love that. Now, let's see, how do I ask this question? I could take this in so many different directions.
So I'll share a personal tidbit, and then I can ask you some tactical things, okay? We did something similar during COVID, with a group of innovators, and in our men's group we did something similar as well. It was a different scale from what you did. Not to invalidate anything that we did,
not to glorify anything that you did per se; it's just more like learning, right? For me, what you did worked, because it has continued to today. Today is 2023, and you are still active and connecting people. What we did kind of just fell apart. We actually ended in some [01:35:00] kind of a fallout, right?
And then the men's group thing, we delivered some groceries, and that was that, right? All good, all fine. So I'm curious, knowing what you know today, what do you think made your group work?
[01:35:17] Andy: Yeah. First of all, it was the only thing I did, it was the only thing Chioka did, and for three other people, it was the only thing they did.
So it was five full-time people giving up their time. Like, full time. The next thing we did, and this was the biggest part, is: how do you develop a new economic system of incentives?
[01:35:50] ck: Mm hmm.
[01:35:52] Andy: To build a bond of trust and love, and forget the love part: [01:36:00] reciprocity and belonging. How do you build that economic structure? And a dear friend of mine, this guy Jordan Hall, has
[01:36:09] Andy: Yeah, so he was part of our group too.
[01:36:12] ck: Oh, great. Jordan, the smart guy. He's the Schmachtenberger to me. So, yeah.
[01:36:19] Andy: Oh, yeah. Well, Jordan's a brother of mine.
[01:36:22] ck: Oh, great.
[01:36:23] Andy: Yeah, and he and a few others were very much mentoring me through this process. So I had a, whatever, advisory council on how to do this without fucking it up, because the stakes were so high.
And as you know, his thing is that we live in a game of Game A, which is kind of a prisoner's dilemma, right? It's a win-lose scenario, so we all lose. His goal, and my goal, is a world which he calls Game B, which is win-win. I think we'd all like that. But how do you get there? [01:37:00] So what I was able to do, we, I should say, I just happened to be the hub, but it was a "we" mentality.
Number one is: hey, I'm not the boss here. I'm nothing. I'm connective tissue for the group. Number two, not intentionally, but it became this way: the bar to get into our group was extremely high. The first criterion is higher consciousness, however we can collectively agree on what higher consciousness means, right?
Compassion for others before self-gain.
[01:37:38] ck: No.
[01:37:40] Andy: Uh, number two.
[01:37:41] ck: I like that a lot actually. Thank you for sharing that.
[01:37:44] Andy: The next thing is Jedi skills. So, top ten in the world at whatever your expertise is. That's a pretty high bar, but we were lucky to start with a core of 20 that all [01:38:00] hit that.
[01:38:02] ck: And then who gets to decide, in terms of the governance? Do you get to say that? Or is it more like a consensus thing?
[01:38:08] Andy: Oh, everybody knew that that's who we were and what we were doing. And then the onus came down on me, which was a real onus. This is one of the reasons why it worked: I had to spend at least an hour or two with every single person. That's 130 people.
Luckily for me, it allowed me to learn so much from these people, and they became my teachers in whatever their fields were. By the way, this part of it almost exploded my brain, taking in all that information. But then the number one rule, and it was the only rule, for how you're no longer part of the community, [01:39:00] is betrayal of trust. Period. Betrayal of trust.
So we made trust the currency. Oh, and I forgot one thing. The third thing: when it became too many people, it was like, I can't think about the way 80 other people think, get into their shoes, to know who to put together with whom. My brain was going to explode.
So we had to make it tighter. The third filter was: if the new person, whatever their Jedi skills are, could benefit each of the other 80, and each of the other 80 could contribute to that one individual, then each new person who came in doubled the power, or influence, of the group, mathematically.
Each time you [01:40:00] added one, it doubled it. So then, being in the group, what was amazing is we all had each other's backs and endorsed each other, and came to each other's help when needed. Right.
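One way to read Andy's "each time you added one, it doubled it" is through simple combinatorics: an n-person group has 2^n possible subsets, so every added member doubles the number of potential collaboration subgroups. This framing is our reading of the claim, not a formula Andy gives; a quick sketch:

```python
def possible_collaborations(n):
    """Count the subgroups of size >= 2 in an n-person community:
    2**n subsets in total, minus the empty set and the n singletons."""
    return 2**n - 1 - n

# With 3 people there are 4 possible subgroups; with 4 people, 11.
small = [possible_collaborations(n) for n in (3, 4, 5)]

# For a group the size of the early core (~20 people), adding one
# member roughly doubles the count of possible collaborations.
ratio = possible_collaborations(21) / possible_collaborations(20)
```

The doubling holds because 2^(n+1) = 2 · 2^n; the subtracted terms become negligible as the group grows, which is consistent with the "mathematically doubled" intuition for a community the size Andy describes.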
[01:40:17] ck: So, quick question there. Trust, in my opinion, is earned over time. You can go through some kind of really difficult situations together, be it Navy SEALs or ayahuasca ceremonies or Burning Man or whatever the thing is you're up to, and you can accelerate it a little bit, but ultimately, in my opinion, trust is earned over time, going through these kinds of cycles over and over again.
I'm curious: how did you develop that trust with the 80 other people in the group? I mean, one, that's really difficult already in my mind. And two, how do you [01:41:00] do that virtually? Because you're not necessarily hanging out together in a ceremony, or at Burning Man, or in some kind of environment where you're living and breathing each other's stuff.
You know what I mean?
[01:41:13] Andy: It's a great question, and here's the answer. Those people's personal reputations were at stake. Nobody was going to bring somebody into the group unless they thought it would increase their own reputation in the group, right? Make them look better as curators.
Andy, I've got somebody really incredible. Would you bet your reputation on it? Because you're already in it. Now the fear of losing that, that's trust right there. Can I trust you? Right. Would you bet your reputation on it? Yes, I would. Then I'd spend two hours with them. [01:42:00]
[01:42:00] ck: So you do ask that question. Would you bet your reputation on it?
[01:42:03] Andy: I do.
[01:42:04] ck: I see. Good. Then what's in it for them, being in the group?
[01:42:06] Andy: Because being in the group, you were able to borrow the reputation of all the other hundred and thirty.
You betray trust, you lose it.
[01:42:21] ck: So how do you betray trust? I mean, again, maybe bringing up what we do in our men's group, and you can tell me how you guys do it. In our men's group, the way we do it is that confidentiality is of the utmost importance, and if you break confidentiality,
that's how you break our trust. But what's betrayal of trust for you?
[01:42:48] Andy: If you intentionally put your own benefit [01:43:00] above the benefit of others,
[01:43:02] ck: I see
[01:43:04] Andy: and mislead, flat-out lie, misrepresent. And it worked. It worked because, listen, during the early days of COVID, these were some pretty powerful people, and there were lots of ways of making a lot of money off of PPE and such, right?
So if we're going to get behind somebody who's moving a lot of PPE, you'd better let us know: are you taking advantage of people? Are you increasing prices, whatnot? You'd better let us know, right? And we moved a lot of PPE.
[01:43:57] ck: I'm sure. [01:44:00] I'm sure.
[01:44:01] Andy: But just be honest: what's in it for you?
[01:44:05] ck: Okay, so personal gain is all right. Just be transparent about it.
[01:44:10] Andy: 100 percent
[01:44:11] ck: within the group.
[01:44:12] Andy: As long as you're being honest about it, right? If you're hiding something, then you're not just risking your own reputation out there in the world. You're risking all of our reputations out there.
[01:44:29] ck: Yeah.
[01:44:31] Andy: So what I was able to do after all of this, and we did all these breakout sessions, because you're right, it was all done on Zoom, which is insane: to build that level of trust bonds on Zoom, across different political views, socioeconomic views, countries, the whole nine yards. To build that cohesion, or coherence, of trust [01:45:00] was hard as hell.
To the point where I would be able to just say: you, and you, and you. Zoom call, the three of you. And I was then on each of those little direct introductions around somebody's project. Okay, let me back up.
[01:45:20] ck: What did you say?
[01:45:21] Andy: I was on the Zoom calls. So say there was person A, and I grouped persons B and C, and the three of them didn't have a lot of one-on-one time in our breakout sessions, and say person A really needed help from persons B and C. Persons B and C, I need you to get on a Zoom call with person A and myself tomorrow.
[01:45:46] Andy: These are busy people!
[01:45:48] ck: Yeah.
[01:45:49] Andy: But because I asked for help, and usually it's like, it'll be good for all sides. It'll be good for both of you, and by the way, it's good for you to help this person, because this person can help you with this, and all this stuff. [01:46:00]
I was then on every single one of those introductory meetings, with a personal touch, and I would say:
Person A, I love you, and these are your super skills. Person B, I love you, and these are your super skills. Person A, I trust you implicitly. Person B, I trust you implicitly. Therefore, right now, I am putting my brand, my reputation, on the line that you two can trust each other. And then they'd be off and running.
And it was, how do you do it? How do you build trust just like that? You endorse.
[01:46:35] ck: So, question here, Andy. Again, I'm getting a little tactical, if you don't mind.
[01:46:42] Andy: You seem to want to get a little tactical and into the engineering of the social dynamics, or no?
[01:46:46] ck: No, man. I really want to understand the social engineering aspect of it.
Because when people talk at a high level, like, yeah, win-win, totally, then I'm like, uh, how do you do it? Like, I don't know. So [01:47:00] here's why I want to learn from this, in terms of personal interest: because I don't know how to do it, right? So I want to get into the tactical.
Cool. People can make strategic introductions all the time. But what I've also found personally is that when there are no stakes, no win-win-win for everyone involved, people don't do it, because they're not incentivized to, right? If it's just a goodwill kind of thing, okay, good luck, here you go, whatever happens, happens.
And that I just haven't found to be long-term sustainable. So again, I'm curious to know your thoughts on it. Is it just, hey, you guys meet, and that's it, I'm out? Or is there some kind of, here's how we can all win-win-win together, and then, you know, the projects?
[01:47:51] Andy: Yeah,
[01:47:52] ck: What are your thoughts?
[01:47:53] Andy: I think part of what made it work, and it's not [01:48:00] self-sustainable for a larger group, for even bigger it's not self-sustainable, is number one: I financially, personally, took care of my team that were full-time on it. And then in these introductions, throughout the entire thing, I've never made a nickel off of any of it.
[01:48:23] ck: Why not? Because you want to share good in the world?
You're okay with that? Or is it just more philanthropic, makes you feel good? Like, why not? I'm curious.
[01:48:38] Andy: During that period of time, and now, there's something more important than money. It's trust, and what can be done between the relationships. And listen, shoot, I don't have the biggest bank account in the world, but my friends and my relationships... if the world goes to [01:49:00] shit tomorrow, I've got a hundred-plus homes around the world that would take my kids in.
Well, that's clearly more valuable to me than more money.
[01:49:11] ck: Um,
[01:49:13] Andy: And, you know, each of these people are like family. I really think about them as family. And there are more warriors out there, warriors for light, doing good in the world. So wait a second: if I care about all of it, and this is somebody who's going to fight the battle on that field, and somebody else fighting the battle on that field, of course, fucking give them more help. And they can all help each other.
Holy shit, now we've got a unified front of these Jedis out there, with the force of the whole behind it. I get to just sit back and be happy and feel good about myself, and not have to fight all those battles on my own,
right? And how do you leverage yourself, right? You know, I talk a lot about leadership these [01:50:00] days, and part of the problem with the world is that we have leaders who want to be leaders, right? This was exhausting. It was horrible. It personally exhausted me and, like, destroyed me. I think I shared with you, I ended up hospitalized for quite some time.
I destroyed my endocrine system.
[01:50:19] ck: Oh, I didn't know that. I see.
[01:50:20] Andy: Oh yeah.
[01:50:21] ck: Got it. So here's one thing that I observe, and I'm not pointing at you specifically, but I'm curious to know your thoughts about this. I meet a lot of people who are very good-natured. They're helpers of the world, right?
When they see something that needs help, they help. But often they do it at the cost of personal health, or finances, or relationships, or mental wellbeing, whatever the case may be, right? Again, I'm not pointing at you specifically.
[01:50:52] Andy: No, we can point at all of those. You can point at me.
[01:50:55] ck: I'm just generalizing this, and hence the question about the [01:51:00] win-win-win
comment, so that this person, speaking generically, is not doing good at the cost of themselves. Because I'm thinking long-term, not short-term. So I'm curious: having been through that, having had the endocrine system issues, all that cost on yourself, how would you do it differently if you were to do it over again?
[01:51:29] Andy: Well, thank you for asking that, because that's what I'm doing right now,
very much learning from my mistakes, which literally
hospitalized me and put me too close to death.
And I lost... my fiancée is not with me anymore. It's not fun to watch your beloved walk around in a tremendous amount of chronic pain, [01:52:00] while helping so many others but not taking the time to help himself, in particular when she's a healer of sorts. Yeah, and I was in the hospital for a long time.
Um, so, it's a silly little thing, but it's so important, and the number of people who said it to me... well, I'm sure for most of the people on your series here, most of the people who give a shit this much, how many people have said to us: put your mask on first before helping others?
Well, maybe that's for others. That's not for me. Well, I learned that lesson the really hard way.
[01:52:43] ck: So knowing what you know today, having gone through that, what would you do differently? Like, I get the metaphor, I think everyone gets the metaphor, but tactically, if you still want to do good at the scale that you want, how would you, you know, put your mask on first? [01:53:00] Again, I want to make it tactical, because people are like, yeah, yeah, yeah, you know,
[01:53:06] Andy: Number one, quit drinking pure poison. Not good for you. Oh yeah.
[01:53:13] ck: Okay.
[01:53:14] Andy: Like, Oh yeah.
[01:53:16] ck: Like,
[01:53:17] Andy: like just sure. It'll make you numb. Whatever. I was in wicked chronic pain. Alcohol worked, but it was also killing me at the same time.
And when you need to zone out or whatever, Oh, alcohol, pretty good thing to make, don't do it. It's poison.
[01:53:36] ck: So,
[01:53:37] Andy: so if you, if you walk around with a clean, a pure, a pure body, um, alcohol, quit alcohol, eat healthier, exercise as an absolute mandatory priority. Um, meditate, spend time with loved ones, [01:54:00] not with your mind elsewhere, but actually being in love with your loved ones.
And then take some frivolous time just to go out and look at the trees or the stars or something and go like, holy shit, this place is insanely beautiful and magic. And then, the answer is take care of yourself, literally, and that's it. And of course, that's where a lot of the sound healing came in for me, really seriously.
Like, there was, as part of my path of healing, restructuring my own brain, my endocrine system, my nervous system. Like, I shattered the whole thing. Right?
[01:54:44] ck: Got it.
[01:54:45] Andy: Yeah. So, um, sound healing, eating, sleep, so important, insanely important, and literally, and as soon as I did that, right, as soon as I [01:55:00] did that, I didn't have to put so much effort into things.
I literally didn't have to put so much effort into things. I'm now able to just walk through life with grace, and more people are just, insanely incredible people are just coming to me, and being like, oh, you're doing this, how can I help? Oh, you're doing this? Oh, how can I help? It's just pure grace. And just walk around, um, it's amazing what happens if you just walk around cheerful.
With your body and your mind cheerful. And, uh, so, no, I didn't. And by the way, my fiancée at the time, the number of times she literally told me: you have to take care of yourself, you have to take care of yourself, you have to take care of yourself, you're gonna die. Like, you're in the process of dying. And I just didn't listen.
And also, once you get into that cycle of adrenaline, just [01:56:00] constant going, you become addicted to adrenaline, you're a junkie of your own adrenaline. Um, so, but it's, it's not, it's not that much fun.
[01:56:12] ck: I, uh, I was speaking to another entrepreneur friend earlier today. And his business is doing really, really well and he has a family and takes his kids to Disneyland with a YPO plan.
So they basically privatized Disneyland, and he can drink alcohol and all that stuff, right? Cool stuff that he's doing. And then he shared with me that he drinks. Uh, he made sure to tell me: well, not a lot, not an alcoholic, but I drink three times a week. And when I drink, I drink a lot. Like, okay. So, uh, knowing what you know today, what would you say to my friend, who, you know, perhaps is numbing some of the stress with alcohol?
Again, I don't know, this is [01:57:00] just... what would you say, knowing what you know today, to my friend?
[01:57:04] Andy: I would ask your friend.
Um, does he feel satisfied in the present moment? Is he enjoying being in his bedroom? And in his relationships in the present moment, um, is he feeling love from his loved one as strong as he could? And is he giving love as much as he could to those around him? To me, that's the greatest joy in life.
Receiving love and giving love.
If he's not doing that, if he's not feeling that way, alcohol is used to numb.
If you're numb, you might be numb to all the negative shit that's going on, but you're [01:58:00] also numbing yourself to feeling the true, authentic love that's around you, and numbing the ability of others around you to get the true, pure love from you. Um, and I, you know, I empathize, clearly, in there. Um, but to be, like, right in this moment... and I try all the time, and this whole thing:
Oh, be here now. Be in the moment. Be present. Well, what does that actually mean, right? Um, to actually be in the moment, appreciating the person who's there with you, appreciating, listening, giving full, 100 percent attention without thinking about what you're going to say next. No, just absorbing. And then sharing without intention, but just sharing pure love, or whatever, toward that person.
Uh, it's just, um, such a nicer, [01:59:00] easier way to live. Such a nicer, easier way to live. And when you get to that state of pure love, pure body and mind, things just come to you. I don't fucking know. Call it manifesting, call it all the books that we all read, uh, call it, you know, the power of thought, call it, you know, whatever.
It just works. And this is something that I, like, for decades, did purely through effort.
[01:59:33] ck: Purely
[01:59:33] Andy: through effort. So clean your body, and then literally clean your body. Clean your vessel, down to the cellular level. Just do it. Get the bad shit out. If you want to think clearly and be able to appreciate clearly and see straight, without foggy vision or distorted vision, then don't drink.
Clearly, it distorts our vision. It distorts our capacity to feel. So, yeah, don't. Yeah, that's what I would say. And then I would say [02:00:00] directly to your friend: I get it, and you are loved. Love yourself, and if you love yourself, why in God's name would you put poison into your body? Why in God's name would you ever do that?
Otherwise, you can say you love yourself, but that's just words.
[02:00:21] ck: Beautiful, Andy. I know we covered a lot of ground. For people listening, is there anything you'd want them to leave with, one idea? I mean, that last one was amazing, but is there anything you wanted to say, anything I didn't ask you?
[02:00:37] Andy: Yeah, um, I think so. Um, and if there's one thing that I think is so fundamental for where we are in the world, uh, it's this concept of forgiveness.
Um, you know, I did a lot of work in Rwanda, and there was a genocide in Rwanda, Hutu versus Tutsi. And the Hutu slaughtered a million Tutsi in a hundred days. Wow. A [02:01:00] hundred days. And when I was over there, somehow, the Tutsi had forgiven the Hutu. Wow. From neighbor to neighbor. Like, they killed each other with, like, axes and pitchforks, like, whatever they had.
And now, they forgive each other. And the reason why is because of the history of how, you know, the Belgians came into Rwanda. Uh, and to get economic benefit out of that part of Africa, they gave a minority of the population these cards that said Tutsi. They said, you're the powerful, you're the upper class, and you always will be.
And the other group, they said, you're the Hutu, you're the lower class, and you always will be. So you have a, like, powerful minority, uh, ruling a less powerful majority. Um, and all of a sudden, just through the power of radio, that's all they had for communication, uh, the word came out that said: Hutu, whatever, go [02:02:00] after them, the Tutsi are about to slaughter you and your families.
Just through radio, okay? Killing, okay? So the reason why they were able to forgive: they were misled, they mistrusted, they got manipulated by powerful sources of media, right? Look around the world today. Um, yes, there are unforgivable things individuals did. Um, and some people, even today, I've seen friendships being ripped apart over what's going on in the Middle East.
Forgive each other. We're all... we're each living in our own reality. It's a tough time. Forgive what people say, and also forgive yourself for things that you might've done. I mean, clearly, after what I shared at the beginning of this, I forgive myself.
[02:02:54] ck: Awesome.
[02:02:55] Andy: Right. And so, forgiveness. Please, in today's world, just forgive.
[02:03:00] ck: Is there a... actually, on that note: I mean, again, it's one of those isms that people say, but when it comes to actually doing it, right? So is there a technique, is there a place you can send people to, so that people who really want to do the work of forgiving themselves, or of forgiving those who harmed them in the past, have some concrete steps they can take?
[02:03:24] Andy: Yeah. Um, a very easy one is to say to somebody: um, the way I perceived what you were communicating, um, was very harmful to me, but that's the way I perceived it. Was that your intention? And if you just do that, first of all, you put the onus on yourself. I perceived what you communicated, not what you intended to communicate. I perceived it as harmful, and painful, and it angered me. Was that your intention? You're [02:04:00] giving somebody the benefit of the doubt, and you're also giving them the ability to explain themselves. And the next thing you can do is just listen.
People have a right to their... I mean, it's just their world of media and their history and everything they've been surrounded with. Just listen without feeling attacked. And maybe, if you get in there, you can actually feel a little bit of, um, compassion toward their life story, um, which will make them less angry and make them actually see more eye to eye with you, and then you can actually get into a conversation.
Um, and, you know, but, you know, shit, if everybody did that, then we wouldn't have a problem with, like, conflict resolution. Like, there it is, you know. Just say: my feelings are hurt, but they're my feelings, from my own perception. Was that what you intended toward me? And then you don't argue, you just shut up and listen.
[02:04:59] ck: Andy, thank you so [02:05:00] much. I really acknowledge you for the openness to look at the origin story. You know, my intention was to use your experience as a teaching moment, and I really appreciate you being willing to go there, you know, and then also, at the end of it, to see that, hey, it's user error and user intent, as well as designer intent, and to see it that way.
Really, really appreciate the master class around, you know, how the media industry, the machinery, is run, how this could potentially be weaponized with more AI, and why we need to be more cognizant, more aware of how we are being influenced. Yeah. And, you know, I just really appreciate everything that you do, and your "frequency is medicine" is a beautiful articulation, and I love it. I can't wait to experience it personally. I'm a big fan of sound healing. I see a lot of potential, as we talked [02:06:00] about, you know, with the possibility of that immersion. Anyway, we could nerd out for a few more hours. All in all, just really appreciate you being who you are and taking your expertise and doing something good in the world.
Thank you so much for being here on Noble Warrior.
[02:06:18] Andy: Thank you. Thank you for everything you do, and to your listeners for everything they do. And, uh, yeah, it's been an honor.
In the current stage of human evolution, we possess the resources and technologies to elevate all of humanity to a state of comfort, satisfaction, and joy. We have the capacity to feed, educate, shelter, motivate, and protect every individual on this planet while also rejuvenating our environment. Yet, despite these advancements, we have not fully embraced this potential, and our actions continue to cause significant harm to each other and the planet, leading to potential irreversible suffering.
Over the past eight years, Andy Russell has dedicated himself to addressing some of humanity's greatest challenges that contribute to human suffering. His primary focus now is on the root cause of all human suffering: mental health. Andy refers to this overarching issue as "The Human Dilemma," believing that the solution lies in compassion, forgiveness, and love.
One of Andy's current projects involves democratizing access to the healing power of sound frequencies to combat the growing mental health crisis. His new venture aims to deliver personalized sound medicine directly to subscribers' smartphones, offering an innovative approach to mental well-being.
During the COVID-19 pandemic, Andy led an underground group of 130 experts from around the globe who collaborated to mitigate the suffering caused by the virus. This group, initially known as Task-Force C-19, was later renamed Task-Force Humanity, reflecting its broader mission.
Early Life and Career
Born in 1971, Andy Russell is a pioneering innovator in digital medi…