Tim Fulton: Welcome to the Confluence Cast, presented by Columbus Underground. We are a weekly Columbus-centric podcast focusing on the civics, lifestyle, entertainment, and people of our city. I'm your host, Tim Fulton. This week: when we talk about artificial intelligence, the focus is often on big tech or headline-grabbing breakthroughs, but for Cas Maxwell, co-founder and CEO of AI Owl, the story is much closer to home. His company is helping Ohio's workforce, schools, and businesses go from AI curiosity to practical integration, backed by state funding and partnerships with Intel and Khan Academy. In this conversation, Cas and I talk about how AI can be a tool for empowerment rather than replacement, what it looks like to train employees and students in hands-on ways, and why the future of AI readiness in Ohio may hinge on meeting people where they are. You can get more information on what we discussed today in the show notes for this episode at theconfluencecast.com. Enjoy the interview.

Tim Fulton: Sitting down here with Cas Maxwell, the co-founder and CEO of AI Owl. Cas, how are you?

Cas Maxwell: Good. How are you today?

Tim Fulton: I'm good. Tell us about AI Owl.

Cas Maxwell: Well, we are a little bit of a complex organization, but at the end of the day, we do workforce upskilling with AI. That's our foundational thing: we go into organizations and teach their teams how to go from demystification to integration with AI.

Tim Fulton: Okay, talk to me. And when you say organizations, that could be businesses, that could be schools. Could that even be, like, nonprofits?

Cas Maxwell: Yeah, it can be any of those. Basically, if you have W-2 employees, you qualify for what we're doing.

Tim Fulton: Okay. And it's funded by the state, so these folks can basically apply for it. Do you do the approval process, or is it basically an automatic approval? How does that work?

Cas Maxwell: The state does the actual approval of it. We'll help guide the organization through the application, which is very simple, takes like 15 minutes. We guide them through the application, they submit it,
and the state will award that grant agreement. And then from there, we can get to training and upskilling, and the state will take care of it.

Tim Fulton: Okay, which is amazing. Talk to me about what the program looks like, how long it is, and then what the outcome is.

Cas Maxwell: You're talking about TechCred, like, our training program?

Tim Fulton: Your training program, yeah.

Cas Maxwell: So what it looks like is, we sit down with the organization to start with and figure out, number one, what are they trying to achieve with AI, and where are they on their AI journey? Then we have modular-based classes. So whether you want to do data analytics, communication, marketing, grant writing, whatever it is you want to accomplish, we then bring in trainers for about eight to ten hours over four weeks. They're actually in person, so you're opening up computers, getting on the desktop, whatever you have, and we're in person with you, doing a workshop-based training.

Tim Fulton: Okay. And what is TechCred?

Cas Maxwell: TechCred is the State of Ohio grant.

Tim Fulton: I thought it was a credential, like an actual thing.

Cas Maxwell: It is. And then, specifically, we're teamed up with Intel as well, so everybody that goes through ours receives an additional Intel Digital Readiness certificate that they can put on their resume, put on their wall, whatever they want. That just shows that they've gone through that course and that they're AI literate, pretty much.

Tim Fulton: Okay. And just nitty-gritty: do you have a preferred platform that you guys use? Let's say it's a medium-sized business and they have some security concerns about putting their data into it. Basically, I'm asking, are you platform agnostic? Are you making recommendations for which platforms to use, whether that be Anthropic or OpenAI? What do you use?

Cas Maxwell: We use about 40 different tools, so it's going to depend on your goal. It can be ChatGPT, which everybody knows, and how do we dive deep into that?
Or it can be way more advanced features: building a computer vision system, or working with specific data analytics tools, anything like that. So there's not one right answer in AI. It's very user-based and very subject-based as well.

Tim Fulton: And I assume that you have a philosophy that AI is a complement, because this is a workforce grant, right? So AI, philosophically for you, is a job enhancer, not a job replacer. Talk to me about your philosophy there.

Cas Maxwell: Yeah, it's a tool. That's what we always say. It's another tool in your tool belt to help you be better at your job, to excel, to push the limit, to go to the next horizon. It's your thought partner. Now, instead of going to the office of the guy who's been with the company for 35 years and asking him these questions, you have that same thought partner at your fingertips, one you can diagnose with, that you can work through problems with, like you said, to be a master problem solver within your organization.

Tim Fulton: Who do you find are the most skeptical of that philosophy? Is it the leaders, who are looking at it as a security risk, or the employees, who may feel, "I am basically just empowering AI to take over my job by telling it how I do my job"?

Cas Maxwell: Right. Those are definitely the two arguments. I definitely feel like the most skepticism is at the top, bringing it into the organization, which may be counterintuitive to think about. But it's a scary thing to bring into your organization. And yes, there are all these promises, oh, it's going to do this or do that, but a lot of people aren't buying in as fast as you would think. I think that just comes down to truly understanding: what can it do? What does it mean to be AI ready? We're in that chicken-or-the-egg moment where an employer doesn't really realize what it means to hire an employee that is already AI ready.
You know, what that person is going to bring to the organization, the value, and what value your company is going to bring by being AI ready. So it's going to come in a wave. Look at 15 or 20 years ago: if you had Excel on your resume, people went, what does that mean? What does it mean to know Excel? Now we're past that frontier, and people go, oh, Excel is everything when you know how to use it. So I think we're in that same weird moment with AI right now, where people don't fully recognize the power of having that on your resume.

Tim Fulton: Okay, yeah. Talk about the work that you're doing in schools. I think Caldwell High is the one that you guys have highlighted as, look at what we did for these kids.

Cas Maxwell: You're probably referring to Tipp City, maybe. We have worked with Caldwell, though. We've worked with over 70 different districts throughout Ohio. Tipp City in particular was where we rolled out the country's first AI course: a nine-through-twelve course, a full semester. And we started with credit recovery students. These were brilliant students, but they just weren't engaged with the traditional learning environment, through no fault of their own.

Tim Fulton: Sorry, I don't know what credit recovery is.

Cas Maxwell: So, basically, students at risk of not graduating, for whatever reasons. But the main thing was just that traditional school wasn't working for them; they weren't vibing with that. So we created this course and decided to pilot it there, because we knew, hey, if we could work with these students and make this successful, then we can work with any student. We spent all semester doing it. These students are now very good friends of the organization. They're actually speaking in front of hundreds of people, and just a semester ago they didn't even want to come to school. So they've had an amazing transformation, and they've been assigned credits with this course. So that's one thing. And then, among other things, we're doing PD training for teachers. We're doing facilitator training.
Khan Academy is one of our partners, so we're bringing Khanmigo, their one-on-one tutor, to schools at no cost. So we have so many different things, and it's all through state-funded programs as well. We know budgets are tight with schools; it's no secret. So trying to find a way to alleviate the budget headache as well, that's a big part of our mission.

Tim Fulton: And you are a for-profit organization. You are a business, correct?

Cas Maxwell: Yes, correct.

Tim Fulton: Talk to me about the founding story and how you guys got started.

Cas Maxwell: Yeah. So I was originally a restaurateur. Still am. Shout out, Paul and Steve, my business partners there. We have a good time; we're almost ten years in now. And then about two and a half years ago, when ChatGPT dropped in November 2022, myself and my business partner, Trace, had a late and great friend. He passed away in the fall, but he came to us and was like, hey guys, I've seen calculators, I've seen desktop computers, I've seen the internet, I've seen all of this stuff. This is bigger than all that, right? How do you guys grapple with this? How can you build a company around it? And so that was really our spark: okay, let's get together, let's figure this stuff out, and let's build a company around it. That was really the founding spark to where we are today.

Tim Fulton: Okay. So you've talked about the state, where it's not a direct partnership but it is state-funded programs, and you have your partnership with Khan Academy. These are big entities. Can you talk about navigating that? Is that your first time navigating that sort of thing? You're a kind of young guy. And then also, how do you choose what is a good partner to work with?

Cas Maxwell: That was my first time navigating anything on the scale of becoming partners with an Intel or a Khan Academy. As you know, they're huge organizations. So I've really got to give kudos to Trace.
I mean, he really navigated that landscape well with Intel, which was also partnered up with Khan Academy, so that's kind of what opened the door there. And then, just working with those organizations, as we got to know each other more and more, we saw the value of: okay, we're really boots on the ground training these people. How can we take this Intel content, which has been created and refined over two decades now, synthesize it, and bring it down to the everyday person?

Tim Fulton: Okay. So the value add, would you say, sorry, I'm skipping around: is that what your company basically does? It's synthesizing this set of modules into something that's digestible, and you're essentially an education company?

Cas Maxwell: Absolutely. I'm sure you've sat through some AI conferences. I'm sure everybody that's listening to this has sat through some sort of AI thing, and it's always such a 30,000-foot view, and people walk out going...

Tim Fulton: You know, I don't know what I'm supposed to do with that.

Cas Maxwell: "How you build an algorithm" isn't going to help me, you know? And so we understand that. We're trying to come to the person that's working, whatever they do: they own a roofing company, or they're the Big Brothers Big Sisters organization, or whatever they are. We're going to meet them where they are and show them how to integrate these tools on a very person-to-person level.

Tim Fulton: Talk to me about what makes you and Trace basically qualified. Like, why are you the best people? I don't know what Trace's background is, but it seems like an interesting entrepreneurial leap to go from restaurateur to, oh, I own and run an educational AI company that services organizations with state grant money.

Cas Maxwell: Yep. What's the pizza guy doing in AI, right? To answer the first question: Trace comes from manufacturing, and also inventing. He's an inventor by nature, an engineer. So that's his background.
And as for what qualifies us, I think the beautiful thing about this space, when we're looking not at AI as a whole, because that's been around since, like, the 1950s, but at the actual LLM side of things, the natural language, where you and I are talking to it the way you're talking right here, is that it's new. It's just a couple years old. So what does qualify an expert in that, other than the people working at OpenAI who cracked that code? We were at that starting wave. So I think it's just getting ahead of the general population, working with experts, surrounding ourselves with amazing people.

Tim Fulton: Does it kind of feel like I'm asking, what qualifies the guy who invested in Apple in 1980 to have invested in Apple? It's like, it doesn't matter. I got there first.

Cas Maxwell: Yeah, and I hope that doesn't sound pretentious, but it just didn't exist before a couple years ago. And that's like anything: you dump yourself into your work, you find that passion, and you find how it solves a problem. That's always the root of everything: what problem is it solving? So, yeah, how do you become an expert?

Tim Fulton: Well, as I have thought about it these past couple of years, the question is always, okay, what's it going to be able to do next? Because there are going to be a couple of big winners, right? There's going to end up being an IBM, a Microsoft, and an Apple. But how can you specialize and become the, you know, the Hewlett-Packard, the company that is supplanting and adding additional benefit? Or, and I don't know, I don't have a philosophy about whether this is ethical, how are you going to create the AI therapist? How are you going to create, God forbid, but it's gonna fucking happen, the AI dating app?

Cas Maxwell: And I've thought about that already.

Tim Fulton: Yeah. So I guess my question is, why, for you, was it education and empowerment? Or was it simply, well, there's grant dollars here for
workforce readiness? Like, why education specifically?

Cas Maxwell: A couple things. Yes, grant dollars, great. No one hates that. Providing value at a no-cost type thing is...

Tim Fulton: And I want to give you a little bit of credit here and say that is good, right? You are doing, quote unquote, good, like Superman does good.

Cas Maxwell: Yes, sure.

Tim Fulton: But, yeah, back to the question. Why are you doing that?

Cas Maxwell: I think the cool thing about the way this is structured is that we can be very flexible on what happens in the market. So when you look at an OpenAI, when you're talking about those big dogs that are going to eat up the small guys, and all of a sudden you're going to have these conglomerates, that is going to happen. For us, it doesn't necessarily matter who does that, because at the end of the day, what we're doing is taking these tools, whoever is the leader in the market, whoever has been vetted, whoever is proving themselves, and bringing that into a training platform. So to us, it doesn't matter who's jockeying at the top, because at the end of the day, we're going to work with whoever is leading the charge, whoever is giving not just the biggest but the best value, whoever is solving the most problems, and we're going to use that for our training. So, does that kind of answer the question?

Tim Fulton: It leads me to: how often are you guys having to update your materials? Because there's always going to be some guy in the back of the room who's like, actually, that's not the best model to be using for that. Like, hey, we're going to be talking about ChatGPT-4o versus o3 today. How often do you have to update those modules?

Cas Maxwell: We're updating on a weekly basis. But there's a caveat there.
You know, we don't want to just throw the next best thing onto the module, because some of these companies have paid versions, as we all know. I'm going to keep going back to ChatGPT, because that's the one everybody knows: you have the free version and the paid version. We know the paid version is worth it; we've used it for several years now, it's amazing, it's totally worth it. But you get these smaller AI companies, startups coming into the market. Maybe they have a great product, but maybe they're not going to be around in six months. And the worst thing we can do is show, especially a school, let's take that as an example, a brand new, shiny tool that just came out, and they sign a contract and spend $10,000 onboarding this AI, and it goes away in six months, or they go bankrupt.

Tim Fulton: That's an excellent point.

Cas Maxwell: So the tools are vetted as well; they're companies that we've identified are going to be around.

Tim Fulton: And so, I mean, is it fair to say that, in a small way, you are sort of feeding into that conglomeration, right? That you are empowering, you know, the Microsofts, the Apples, the IBMs, by basically recommending that platform or that model. That's more of an observation than a question. In order to educate folks a little bit: what would you recommend for just a very generalist business, maybe like a consultancy, or, let's be more specific, like a PR firm? What would you recommend for them to do in order to choose which model or company, to say, okay, I'm going to start subscribing to this service? How would you invite them to investigate it?

Cas Maxwell: How would they make a choice outside of bringing us in, right? No, I mean, I think Product Hunt is a really great website. They're keeping up on what's coming out every day, along with the steady tools that have been around, and not just AI tools, any tools. They have a ranking system. They have reviews.
They have individual people going in and talking about it as well, almost like a Reddit thread. That's a great website to pull some knowledge from. And, obviously, YouTube University: going to YouTube, watching the videos. And push the free version they have to the max before paying. We've already been seeing companies, like I said, that came out and were great, but now they're folding up, and people were left holding the bag. That's the last thing we'd ever want when suggesting things.

Tim Fulton: Well, and I think the interesting thing that's happening, when you say holding the bag, is that somebody may be listening and thinking, well, you just subscribe to a new service. But the issue is, when you tell it something, it remembers it about you. So if you're feeding it data, or feeding it your decision-making ethos, you're investing some time in getting it up to speed with what you need it to do.

Cas Maxwell: Yeah, it's an excellent point. What you just said there, that's the worst of it. It's not necessarily the money. Don't get me wrong, if you sign a huge contract, it's something, but if you're spending 20 bucks a month, it's not about the money. What it's about is: you just showed your entire organization, you did training on it, you've built systems around it. You know what they say about when someone gets let go and you've got to hire somebody else? I forget how much money it takes to bring on a new employee. That's pretty much what you just did. You spent all that time and those resources, and now you've got to redo it. Nobody wants to have to do that.

Tim Fulton: Talk to me about what you think is next for you guys.

Cas Maxwell: What is next for us? Well, definitely on the school side of things, we want to continue down that path. We enjoy helping not only the teachers and administration, but the students of tomorrow.
I mean, a school is built around making sure that students are prepared for the world outside of school, and so we are diving deeper into that. We're working with some four-year institutes, and two-year institutes as well, to help guide students after high school. So education is a huge part of ours, and then also hopefully expanding our reach a little bit as well.

Tim Fulton: Outside of Ohio?

Cas Maxwell: I think so, yeah. I think we would like to. I mean, Ohio is our home and where we want to put all of our focus, but as the company grows, being able to have a wider impact is something that we would love as well.

Tim Fulton: Can I give you an idea?

Cas Maxwell: Do it. Heck, I love that if you pursue it.

Tim Fulton: I have it documented here that... sorry, yeah. May I share an idea with you?

Cas Maxwell: Yes.

Tim Fulton: So if you do it, I want to get royalties. Well, it doesn't have to be that; I just want to help. Okay. So when I was in middle school, there was this program, and it's kind of nascent, but it does still exist; I just looked it up a couple of weeks ago. It was called Future Problem Solvers, and it was a competition similar to a spelling bee, or there are other various programs like it. Essentially, a region of middle schoolers: you have a team and a couple of alternates that go from your school, and everybody goes to a big gymnasium or conference center or something. At the beginning of the morning, they give them basically a two-page brief: hey, how would you solve this problem? Maybe, how do you get enough trees planted in a community that has very limited green space? And here are the constraints of this problem, and here are the parameters that you need to stick within, like, this can only be a five-year solution or a ten-year solution. And then they basically give the kids four hours to figure it out. So it's a team of, like, five kids, and they're going, well, what about this? What do we do with this? And those alternates are also competing individually on similar problems. Then everybody goes to lunch, the judges read all the responses, and they award what they think is the best solution, and that's the winner, basically.
What if we could do that with middle schoolers and AI? Present a: here's the problem, how would you solve it with AI, and here are the constraints under which you're going to do it. Just sic the kids on a computer to figure it out and make recommendations, maybe a business plan around it. Because I do believe that you're right, that this is... I think Obama famously said it: he went into a room of pretty smart people and he said, okay, how should I think about this? What is this? Is this the internet? Is this manufacturing? And they said, this is electricity. That's how much of a change this is going to be to our society, and kind of quickly. Wouldn't it be great to, one, come up with some cool ideas with these kids, and also empower them? And then you're looking at a kid who's going to be able to put that on his college resume, and not just a credential. You know what I mean? So that's the idea.

Cas Maxwell: Well, I'm ahead of you already. I don't want to let the cat too much out of the bag, but we're working on some festivals next year, around the state, that are going to do pretty much exactly that. It's not going to be middle school; it's going to be nine through twelve. We're working our way back through the grades, though; we're finding some solutions for middle school and grade school, to get them in earlier. But the vision for next year is to be putting these students through that semester-long course we talked about, and during that, it's all project-based learning, right? So, what engages the student on their day-to-day level? Whether it's building, I don't know, let's just say a chatbot for the website, or building a vision system. We had one student build a Red Dead Redemption guide, a really in-depth guide that went through the game. Whatever they want to do. And then basically creating these festivals for these students to prop these projects up.

Tim Fulton: Basically an AI science fair.
Cas Maxwell: Exactly. And then having those demos there, having the judging there, having the collaboration and the problem solving at these festivals. And then, really, what we're doing as well is hiring a bunch of high school tech interns throughout Ohio and giving them a job within our company. We're doing the same thing with college tech interns, bringing them into the company. They're then actually working with these businesses we're talking about, finding the problems those businesses are having and helping create solutions, almost like a little incubator. And then, if the solutions are great, they get spun out into a company. Those interns get to own part of that company, and they get to help drive it forward, keeping it in Ohio, growing the Ohio economy. So we're really looking at that, because it all ends with the student. So many people want to talk about what you do for the districts and the superintendents, but at the end of the day it's about the student: what are their outcomes, how are they being prepared, and how do we show that off, instead of keeping them behind that closed door, that desk? So that's a huge thing. Thank you for bringing that up, because that is a big thing that we want to do, and I'm glad we're aligned on that.

Tim Fulton: Yeah, definitely. That's great.

Cas Maxwell: Real quick: were you in that middle school program? What was your project?

Tim Fulton: I do not remember. I know that I got first place as an alternate, which means I was not chosen to be on the team, but I was the best. My team didn't place at all, but I was the best of the last kids who got chosen for kickball, among everybody in the state. So I have that; I can hang my hat on that.

Cas Maxwell: I'm trying to think of my science stuff. I think I won... I had the fastest mousetrap car. I don't know why that came to mind.
Tim Fulton: That's good. Well, I mean, there are the guys who talk about what it was like quarterbacking for the big game, and there are the guys who talk about their Eagle Scout project.

Cas Maxwell: Yes, absolutely.

Tim Fulton: I end every interview with the same two questions: What do you think Columbus is doing well? And what do you think Columbus is not doing so well?

Cas Maxwell: That's a tough one. I wish I had time to prepare for that one.

Tim Fulton: Well, if you would listen to the podcast before...

Cas Maxwell: True, true. I did listen to the one you put out a month ago. What is Columbus doing well?

Tim Fulton: This could even be a "here's what I like about Columbus."

Cas Maxwell: I mean, I love Columbus's energy, and not just energy in people. I'm not from here. I grew up coming to Columbus once every year, maybe once every year and a half, the occasional Easton trip or whatever. And pulling in, seeing the skyline as a kid, it's like, oh my God, we're in the city. And now that I live here, I don't take for granted a single day that I live here. I see the cranes, and I see the people, and the people moving here, and we were talking about venture and stuff like that, and learning about all these new businesses. I just think there's so much energy. That's what I love about it. I just love the energy. And we're in the Midwest, so people are nice and welcoming.

Tim Fulton: And what do you not like so much about it?

Cas Maxwell: What do I not like about it? I don't like the traffic going out of Columbus at 5pm, going back east. That's what I don't like.

Tim Fulton: That's fair. Cas, thanks for your time.

Cas Maxwell: Thank you so much. We appreciate it.

Tim Fulton: Thank you for listening to the Confluence Cast, presented by Columbus Underground.
Again, you can get more information on what we discussed today in the show notes for this episode at theconfluencecast.com. Please rate, subscribe, and share this episode of the Confluence Cast with your friends, family, contacts, enemies, your favorite prompt engineer. If you're interested in sponsoring the Confluence Cast, get in touch with us. We can be reached by email at info@theconfluencecast.com. Our theme music was composed by Benji Robinson. Our producer is Philip Cogley. I'm your host, Tim Fulton. Have a great week.