Taking The Supply Chain Pulse

Dr. Randy Bradley on Building a Future-Ready Healthcare Supply Chain

May 16, 2024 | St. Onge Company | Season 1, Episode 9

Join Dr. Randy Bradley, a distinguished University of Tennessee professor and an authority in supply chain and healthcare IT, and me as we discuss current healthcare supply chain issues and challenges. Our conversation reveals the connection between local operations and the global supply chain, emphasizing the resilience necessary in the face of inflation, geopolitical strife, and crises like pandemics. Dr. Bradley sheds light on the critical shortage of skilled professionals in the industry and why domain knowledge is paramount, guiding us toward a more agile and fortified healthcare supply chain.
 
Our episode culminates with a critical look at AI's role in the workforce and education. We explore the parallels between the music industry's evolution and the debates surrounding AI-generated content, pondering the responsibilities corporations have in retraining workers. Furthermore, we discuss how students should harness AI tools like ChatGPT without compromising their judgment, especially in sensitive sectors like healthcare. We're navigating uncharted territory where the fusion of human ingenuity and artificial intelligence sets the course—join us for this compelling voyage into the future of healthcare and technology.

Speaker 1:

Hello again, everybody. This is Fred Crans with another episode of Taking the Supply Chain Pulse. Today we have as our guest Dr. Randy Bradley from the University of Tennessee, Knoxville. But before we start, I'd like to ask: if you like what you've been seeing so far, please subscribe to our podcast, and if you have comments, tell us what you'd like to see and the subjects you'd like us to cover. And if you or anyone you know would like to be a guest, let us know that too, because we're always looking to expand the people we talk to.

Speaker 1:

It doesn't need to be the giant supply chain leader. It could be someone who works in the storeroom who you think has done some stuff we should be talking about. So we're looking for whatever input you can give, and we appreciate it. That said, today I am happy to have as a guest Dr. Randy Bradley. Dr. Randy Bradley was the winner of the 2020 AMRA Award from the Bellwether League, the National Healthcare Supply Chain Hall of Fame, and he is an industry leader teaching at the University of Tennessee. In the back of this picture, on the left side, there's a rotating Auburn University football helmet, because that's where Dr. Bradley matriculated before he got his advanced degree. ("Matriculated," hey, that's a pretty good word for an old guy like me. "That's a good word, yeah.")

Speaker 1:

So, Dr. Bradley, thanks for being here. Why don't you tell us a little bit about your background and how you got to where you are today?

Speaker 2:

Sure thing, Fred. Thank you for having me. It's always a pleasure to talk with you, and now we get to bring everyone else into the conversations that we usually have on the side. So, as you indicated, I am a professor at the University of Tennessee; I'm an associate professor, and I've been here now for about 18 years. Prior to that, I was an IT consultant, working for a big-box consulting firm, as we used to say. Since my time outside of that realm, I run my own public speaking and consulting firm, where I work with companies around the world, helping them with supply chain strategy, IT strategy and analytics strategy, as well as digital transformation and digital road-mapping, or, essentially, defining, creating and executing their digital journey. In my spare time, I also run another entity with a urologic oncologist, where we develop healthcare IT solutions to improve the way healthcare providers communicate, collaborate and coordinate efforts to solve what we call mega-problems in the healthcare space, primarily trying to address issues of health disparities and inadequate access to sufficient healthcare. So those are things that I do on the side.

Speaker 1:

Wow, that's pretty impressive. You said that in about a minute; it would have taken me about ten to get all that out. So, from your perspective: most of us in the healthcare supply chain are working in a very narrow niche. We're basically concerned with what's going on right in front of us; we don't see the bigger picture. You have the luxury, and the responsibility, of seeing the bigger picture and bringing what's going on outside to bear on what's going on with us. So, as you take a look at supply chains in general and the healthcare supply chain in particular, what trends do you see that we need to be looking out for and dealing with in the near and intermediate future?

Speaker 2:

Yeah. So I'll take it in a couple of ways. One, I can say the challenges we're seeing are all too familiar on both sides, and when I say both sides, I mean healthcare and everything else. We're dealing with issues related to price increases, primarily driven by inflation, and geopolitical challenges. I think that's top of mind for most supply chain leaders, whether in healthcare or in a different industry. You have to be what I call "glocal," which is to say globally oriented, mindful, aware and astute, even if you're executing on a local level.

Speaker 2:

The days of saying "here's my space, here's my sphere" are long gone. We are part of a global supply chain, whether we choose to be part of it or not. So those price increases and challenges are there. Manufacturers, device companies and pharmaceutical companies are dealing with that, and they're not going to eat that cost. And then we start to see disruptions in the supply chain. At one point we saw those disruptions as things triggered by a pandemic; prior to that, we saw them triggered by natural disasters or human-made disasters. You have the issue in Baltimore with respect to the bridge. There is going to be tremendous fallout from that, and one portion of the fallout is going to be issues with price increases due to the cost of executing the recovery. We've got to share, and when I say we, I mean those who had products on that vessel are going to share in the cost of reconstituting that vessel and recovering those items. That's simply what happens in maritime situations. And so those are things we may not see for three, four or five years. We may look down the line and ask, why are we experiencing cost increases, why are our margins decreasing? And we're going to realize it was something that happened three to five years prior. So I always tell people: pay attention to what we call tremors, because tremors ultimately become global and, in many cases, large-scale earthquakes for those of us in this particular supply chain space.

Speaker 2:

Another challenge that we're seeing, one that is affecting all organizations but I think is even more paramount in healthcare, is talent shortages: the inability to attract, hire and also retain the talent that you bring into your organization. Now, that's the big picture. What does that look like for healthcare? For healthcare it's twofold. It's not just a shortage, it's also a skill deficit, because we have been conditioned to focus primarily on whether an individual has healthcare experience rather than whether they have domain expertise. And so what we've settled for is: I want you to be in my industry, even if you know nothing about the role. That has been a linchpin for a long time, and it's coming to the point now where, because everything is evolving around us, we can no longer acquiesce to that particular mindset. And chief supply chain officers: if you want to be that, you've got to act like that. I always say, if you want to sit at the big person table, well then, don't sit there and tuck your napkin in your shirt. Let's act like we belong at the table.

Speaker 1:

I've got to jump in here on that, you know, while I tuck my napkin in my shirt.

Speaker 1:

But you know, the fact of the matter is that for the longest time you didn't need to know much to be a healthcare supply chain leader because we didn't have a real supply chain.

Speaker 1:

We outsourced the purchasing and contracting component to the GPOs, and we outsourced transportation and logistics to the med-surg distributors. So if you were to walk into an organization, you would find that the person leading the supply chain probably had some kind of high-level formal education, maybe in supply chain, but oftentimes not. I was a history major at the University of Miami. I learned about supply chain by stocking the shelves in the hospital I was working at, and so we never had to have any expertise about running supply chains. Now supply chains need to have people who know how to run supply chains running them, and they also have to have more than one person in the operation with the expertise, talent and skill sets that are necessary. So, as you said, the analytical skills, the various skills in transportation, logistics and all kinds of operational stuff, it just isn't there. And, to make it worse, historically healthcare organizations have been really tight with the money and not willing to spend to get those types of people. Would you say that's fair?

Speaker 2:

Oh, I fully agree with you, and I love the point that you made, which is that there was a time when supply chain wasn't a formalized discipline. There were no degree programs focused on supply chain. The interesting parallel is that that was the case for every industry, not just healthcare. So why is it that other industries have evolved in their approach, but in healthcare we still follow that same paradigm, where most of our leaders are there by way of attrition? It's about longevity, not necessarily because I'm actually skilled in that domain. I may be more skilled than those who work under me, but that's a fairly low bar to get over. And you're right, part of that is about the funding mechanisms. I'll give you an example, Fred.

Speaker 2:

I was once trying to assist a healthcare organization in getting supply chain talent, and when they sent out the job description, that thing was 15 years old and hadn't changed, and it was an hourly wage. I said there is no person I know of who's coming out with a formal degree in this discipline who's going to take an hourly wage. It just doesn't happen. And so, again, our hiring practices, our policies and our procedures haven't evolved as other industries have evolved. And so when we talk about why we are behind, again, it's because we've always said one thing, which is similar to life sciences: we're different, and because we're different, we're special, and because we're different and special, we're still behind.

Speaker 2:

Yeah, and it's one of those things. We have to realize we play in the same space as every other entity, and when I talk with companies, one of the things I try to tell them is: no one is choosing you because of your industry. They're choosing you because they believe in the opportunity. And I love where you started with this, about the mission. Do we have a talent base coming up that is mission-oriented and mission-focused and willing to forgo the highest salaries? I would say yes, and I won't even say that it's just now coming up; I think it has always been present. The problem is, we have always led with "well, we can't afford to pay you what everyone else can afford to pay you." When you lead with that, you've created a tension in the conversation that causes the person to sit back in their seat, rather than saying, "here's what I believe you can come in and help us do."

Speaker 1:

We don't even lead with the mission; we lead with the low cost, right? Yep. And you know, it's interesting, because you said something about low cost that triggers something. I was at a meeting the other day about robotics, and this company representing robotics was talking about all the different types of vehicles and what those vehicles were doing, and I typed in a question about one large health system. I asked, how many FTEs did these things eliminate? And the company that was selling the robotics said, oh, we're not there to eliminate FTEs, we're there to improve operations. They just transferred the FTEs to doing other things. And I wanted to type, in big letters, it's always about the FTEs.

Speaker 1:

And two days later, I swear to God, two days later, a national publication reported that that same large IDN laid off a thousand people, most of whom were not patient-facing, most of whom were in support roles. So it's always about controlling costs and reducing, curbing or right-sizing the number of FTEs you have in the organization, and that has not changed since I've been around, and I doubt it ever will. So how do we go about mitigating these circumstances? What do we do to create a successful future for folks in the supply chain?

Speaker 2:

Now, that's interesting, Fred, because when you were talking about that headcount reduction, or right-sizing the workforce, as we oftentimes like to call it, you might be aware there was a recent release of a study that Gartner conducted with, I think, about 175 supply chain leaders, asking them about the role of generative AI, with a particular focus on the supply chain. The thing that was interesting is that when they looked at the percentage of those leaders who said they planned headcount reductions primarily due to their use and adoption of generative AI, 81% said it's going to happen in the supply chain. 81%. That was higher than any other functional area, and this is what they're telling us; this is their intent. And then when you start looking at what percentage of the workforce they expect that to be, on average, supply chain was again at the top, at somewhere between 7% and 8% of their workforce. They're expecting that to go down. And so that comes back to your point; there are two points that I think we'll get to.

Speaker 2:

One is: how do we mitigate this? What I found is this: generative AI, like any other technology, is an inanimate object or phenomenon. It, in and of itself, does nothing to mankind. It's those who leverage it, how we leverage it and for what purpose we leverage it. And what I found is that it is a technology that is grossly misunderstood, and as a result, it's like putting a firearm in the hand of a novice. Bad things are going to happen at some point in time if we don't properly train them. So one way is that we have to make sure people understand what generative AI is. Where can it be valuable? But, at the same time, how could we create some unintended consequences because of our lack of understanding and appreciation for what it is doing and how it's done? Because I always tell people this: any large language model that you bring into your house was in someone else's house first. It was trained somewhere else, by someone else, on something else. And so, even if your data and your approach are not biased, someone else has already biased it for you, and if you're not aware of that level of inherent bias, you're going to exacerbate that bias. As a result, you're not going to get the value that you want, and you think it's about replacing people.

Speaker 2:

I'm always a fan of this: I believe technology should allow us to do two things. It should fill and not steal. It should fill gaps in time, gaps in bodies, and it should fill gaps in knowledge. It should not steal. And so what that means is this: we've already said there's a challenge with finding enough people. So if that's the challenge, why would I try to get rid of people? Logically, that makes absolutely no sense. I don't have enough people, so I'm going to let go of some? I mean, who does that? But yet it baffles me, and I often ask: is there a corporate brain drain? Do we get into a situation where there's so much groupthink that no one ever thinks about the logical approach to really making decisions anymore?

Speaker 1:

Well, you're hitting on a lot of things; you just wiped out a page I had over here. I was going to talk about Mo Gawdat, the former chief business officer for Google X, and he said, number one, we don't understand half the things AI is doing. Number two, the speed at which AI is learning is petrifying. And Stephen Hawking, in 2018, said there is a greater danger from AI if we allow it to become self-designing, because then it can improve itself rapidly and we will lose control. So you've talked about all that. And the next thing, and Gawdat said it a couple of different ways, is that bad actors are a problem.

Speaker 1:

No individuals were able to create nuclear weapons. It took large resources. You had to get U-235, which was difficult to get, a lot of things. But you can take five smart guys in a basement and they can wreak havoc upon the universe if they've got a big enough system and access to enough data. And since there's money to be made, Gawdat said that it's not human incompetence that will cause our downfall, it's human greed. And to me, that's what has scared me over the last few weeks. The more I've been reading, the more frightened I get, because there's a load of money to be made out there, and someone can literally rule the world if they win this competition.

Speaker 2:

Yeah, and one of the things that I don't think we've spent enough time talking about in this space, whether we're talking generative AI or any other form of AI, is this: when and where does intellectual property remain something that we protect? Because, if you think about it, you're training these models on things that you don't own. You're recreating, adapting and changing aspects, processes and, in some cases, inventions that you don't have the right to. I remember one of the things I was looking at in my master's program many years ago: what do we do when our legal system, in terms of its pillars, doesn't move at the pace that technology advances? And so we're always behind. Go back to when you had music being streamed from peer-to-peer networks, yet the artists weren't benefiting from it because of how it was being done.

Speaker 2:

It took us a while to rectify that, and that actually led into another era in which we had streaming music services that you paid for, and some of that was actually shared; royalties were shared with the artists. But how are we doing that now, when we're regenerating things? When I say, create an image that has Fred Crans playing baseball wearing a Lindor jersey, and then I give him Lindor's voice and I give him Lindor's swing, right, where does Lindor end and Fred Crans begin? And so again we're seeing this whole thing with image and likeness as well. There were recent cease-and-desist issues in the music world because one artist decided to put out a set of lyrics using the voice and image of a dead artist, and so there was a spate of cease-and-desist actions.

Speaker 1:

There you go. And these are simple things, you know, compared to my bigger fear. You hit on something that I think we need to get at, and that is the responsible organization. And that gets down to this: if I know that the guys who are moving the boxes through the warehouse and the guys who are unloading the trucks at the dock can be effectively and operationally replaced by robots and machines, then my responsibility is to develop a program in which I can migrate those people into doing, as you say, filling some of the gaps out there that are not able to be filled right now. And I also think the real danger is that when things happen to people, they don't see it coming.

Speaker 1:

AI is not like the fear that I had when I was a kid, where we would duck and hide under the desk and put our hands over our heads because civil defense required us to do that once a month. With AI, you don't see it. Many people don't know that their jobs may be gone in four or five years. And where is the responsibility, the corporate responsibility, to other human beings to create alternative life paths for those people? You know?

Speaker 2:

Well, see, Fred, you're touching on the empathy piece, right? But the assumption is that they care, and it goes back to your other point; that's the presumption here, that we actually care about the individuals who are in our organization. It's amazing that the same people we called superheroes a few years ago are the same people we're pushing out the door now.

Speaker 1:

Yeah.

Speaker 2:

For the sake of efficiency and cost savings. And I often tell people, because we hear it all the time and it becomes corporate speak, that "value-added activity" or "value-added work" is what we're freeing them up to do. And then, when you push, well, what does that look like? I always try to help them: it looks like this. Make a list of the things that you know you need to be doing but can't do, because you don't have the time or you don't have the bodies to actually do it. Those are value adds. Those are things you reposition and reallocate your workforce to tackle, because those are things that make a difference, and they're part and parcel of your mission and the objectives of your organization. They're going to help you better serve your customers, your consumers and your other internal stakeholders.

Speaker 2:

And that comes back to something you said earlier, which is that the challenge is greed, right? There's nothing wrong with chasing optimal operational performance, but when I'm willing to do that at the expense of someone else's livelihood, that's when we're willing to cross any line. And sometimes it's not even about morality; it really becomes a question of ethics. We talk about ethical AI, and again I often say you cannot make AI ethical. The people who deploy it need to be ethical, and if we don't start there, with the human portion of it, the technological piece is nothing more than an apparatus that's unleashed.

Speaker 1:

Yep. You've got another Mo Gawdat thing in here. He says AI doesn't care about ethics or morality. What humans teach it, and what it teaches itself, is what it's about. And the really scary part that Gawdat and a few of these other folks get into is that the machines are teaching themselves.

Speaker 1:

It sort of gets down to 2001: A Space Odyssey, where HAL the computer took over everything. That seemed really ridiculous and futuristic at the time, but it's a very real possibility these days. That being said, my goal when I was an undergrad was to become a university professor; I wanted to be a history teacher. And I often thought about what it would be like if I were a professor today. All the kids are turning in papers that are perfectly typed and spaced and spelled correctly because they're done on computers. And now the kids don't even have to write their own stuff. They can just go to ChatGPT, type in some parameters and come up with something. How do you deal with that?

Speaker 2:

My approach is a little different than I think some others'. I actually encourage students to explore with ChatGPT, but I also tell them this: you're responsible for what it produces, particularly if you turn it in. And so I look at it this way: view it no differently than you would view any other source of information. Cite it as a source. But at the same time, even if I cite something and say it was written by Smith et al., 2023, well, I'd better know what Smith et al. actually said. I'd better know what it means and how it's relevant.

Speaker 2:

So just because ChatGPT gives you a paragraph, or gives you a page, doesn't mean you can copy and paste it, because here's the thing: how much of that did it actually lift from someone else's work? And so now we've got this whole issue where plagiarism is real, even if it's been automated. Plagiarism is plagiarism. Those concepts don't change, and I think it's about helping students understand that you're responsible for plagiarizing other works, even if someone else told you, "here's what you should write."

Speaker 1:

Yep. Well, when you think about it, due to the way it works, AI and ChatGPT, it's all plagiarism, because the machines are accessing this gigantic database, and the database contains the work of other people, oftentimes without attribution. And one of the terms that I learned while I've been getting into this is "hallucinations." Have you ever heard of that? Tell us about that.

Speaker 2:

I love this, yeah. So AI hallucinations are when AI gives you errant results that actually are not factual, and primarily it's like this. We actually play a game around this with some of our executives in our executive programs. Fred, what I'll do is I'll have you say a word, and you tell that word to the next person. The next person has to draw an image based on what they heard you say. They show that image to a third person, who has to tell them what that image reflects, and then, once that person tells them what it reflects, a fourth person has to try to draw what they're saying.

Speaker 2:

And what happens is there's a degradation over time, and AI works the same way. I feed it one thing; if the conclusion it draws is wrong or incorrect, it doesn't know that. I always say this: your data is always perfect to any type of AI model. You know it's not perfect, but the model doesn't know it's not perfect, and it's always going to continue to evolve based on it. So what's going to happen is it's going to continue to propagate the things that were given to you erroneously. Everything else is going to be an iteration of that, which means it's just going to become more wrong and farther from the truth. That's what we mean when we talk about AI hallucinations.
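
As a rough illustration of the compounding drift Dr. Bradley describes, here is a minimal, hypothetical Python sketch (not from the episode): a toy "model" that regenerates each round from its own previous output rather than from the ground truth, so an early error plus a little noise keeps pushing the result farther from the truth. The function name and parameters are invented for this example.

```python
import random


def telephone_drift(truth: float, rounds: int = 5, bias: float = 0.05,
                    noise: float = 0.15, seed: int = 42) -> list[float]:
    """Toy sketch: each round perturbs the PREVIOUS output, not the original truth.

    `bias` stands in for an early wrong conclusion the model never knows to correct;
    `noise` is the ordinary variation each regeneration adds on top of it.
    """
    random.seed(seed)
    outputs = [truth]
    for _ in range(rounds):
        # The next "generation" only sees the last output and adds its own error.
        outputs.append(outputs[-1] + bias + random.uniform(-noise, noise))
    return outputs


if __name__ == "__main__":
    truth = 1.0
    for i, value in enumerate(telephone_drift(truth)):
        print(f"round {i}: output={value:.3f}  drift from truth={abs(value - truth):.3f}")
```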

Speaker 1:

Yeah, that was an interesting thing when I found it. So you have to have checkers; there have to be auditors of any input that you get from AI. AI can help you do research. If I wanted to write an article about supply chain, it could give me ideas that I can then pursue. But it's up to me to make sure that I don't just take it as a verse from the Bible, as scripture, and then publish it.

Speaker 2:

So it goes back to something else you said a moment ago, Fred, which is why I believe that when we're chasing this elimination of roles, what we really should be saying is that domain and subject matter expertise is going to become more important than it ever was. Because if you don't have it, how can you know whether what that solution is giving you is wrong?

Speaker 2:

Right, you won't. You're just going to accept it as fact and move forward. So when you eliminate that knowledge base from your organization, you have created greater liability in the long run.

Speaker 1:

Yep, and so there is hope. There's still room for the humans out there, right?

Speaker 2:

Yeah, you know what? I'm not against AI, I'm not against generative AI, but at the same time I'm not against humans either. I actually believe, and you have heard me say this phrase before, and it's actually a book that I'm working on, Fred, called The Manchine Movement, and it's mankind plus machine. It's written from the perspective of really understanding that, as we move forward, we have not been able to get optimal performance from human beings because we're not wired that way. There's a reason we're not robots. At the same time, you can get optimal performance from a robot, but you can't always get creativity. So what we find is that there are gaps on both sides, and to really get maximum performance, creativity, and innovative ideas and solutions based on a wealth of experience, you need both together.

Speaker 2:

I always say it's augmentation over automation, and when you have an organization that is simply pursuing automation, that's different. And then I tell people: mechanization and digitization are not the same thing. I can put a machine in and have it do just about anything I want. But the information that is spun off from that machine, is it at a level that can be consumed by the decision maker, so that they feel more comfortable and confident making a decision at that point in time? Because that's what digitization gives us. It goes beyond mere mechanization.

Speaker 1:

You know, you've got your evangelical background in here. When I hear "augmentation over automation," I know Jesse Jackson's going to jump out of the screen at any moment. That's what I like so much about talking with you. So you've got an optimistic view of the future. So, free throw: what would you like to talk about that I haven't asked you?

Speaker 2:

You know, one thing, and it was probably along the lines of where you were going to go, is: how do we get people interested in roles in healthcare? So I'll take that and then add this piece in from the free throw, which is this: we have to realize that individuals aren't always choosing you because you're state of the art. They're choosing you because you have really cool problems that they get to work on. And I've found this: what an employee hates most is to be bamboozled. They don't want to be lulled into a situation where you sold them a bill of goods, telling them that this is what you are and that you're the next iteration of the greatest organization on the face of the earth, and then they get there and realize you're not even using Excel; you're still doing everything with pencil and paper. What they'd much rather be told is: we're still doing everything with pencil and paper, but we believe you can help us evolve into being a truly digital organization.

Speaker 2:

And what we find, and I forget the study number, but there was a study that showed a stark comparison between organizations that are digitized and their ability to attract a workforce. Really, what it came down to wasn't that they had already reached that point; it's that they were working toward it, and they were in a much better position in terms of attracting, hiring and holding on to the talent they brought into the organization, versus those who were not even interested in trying to evolve. So my thing is this: you don't have to be the best to get the best. You just have to desire to work toward the best, and the best are willing to help you get there.

Speaker 1:

Those are excellent points. Another thing that I think is important, too: I came out of the military and went to college at the University of Miami, then started working at Baptist Hospital in Miami, and it was because they nurtured me that I stayed in healthcare. And I wonder if organizations out there today understand the concept of nurturing their workforce the way the folks who nurtured me did back in 1969, you know.

Speaker 2:

Yeah, yeah, that's a good point, Fred. We talk about the customer journey, we talk about the supplier journey, and I think it was about a month ago I was on a panel in Atlanta and I asked, but what's your employee's journey? How do they get to you, where do they want to go, and at what point do you demonstrate that you care about what's important to them? I often talk about the difference between training and development. Training is what you give me to help you; development is what you give me to help me. And oftentimes we're not developing people, we're not developing leaders. We're training people to do something for the organization, and there's no reciprocity there.

Speaker 1:

What a great way to sum up what we've talked about. Randy, I really appreciate having you here, Dr. Bradley, and I think we need to do another one of these, because the other day Amy Watson, who works with us, and I were talking about having you come in and speak with our folks, and she said, well, you can't let him be alone with Fred, because those two guys talk forever. So I took that as a compliment ("It is a compliment"). But before we go, I have one question.

Speaker 1:

I'm not going to ask you why your school couldn't pick a single name for itself, like the Plainsmen or the War Eagles or the Tigers, but I want you to answer this question for people out there who may not know: who is the greater athlete, the better athlete? Is it Vincent "Bo" Jackson or is it Michael Jordan?

Speaker 2:

No doubt it's Vincent Bo Jackson.

Speaker 1:

There you go, thank you, I agree with that, and when I say it I get pushed back. But anyway.

Speaker 2:

Well, that's because they just see Michael Jordan as one of the greatest basketball players, but he didn't fare too well in baseball. He hit about .205 at Double-A, right? Hey, you've got a guy who played in the major leagues, played in the All-Star Game, was a Pro Bowler on the football field, and was a world-class track star as well. So this guy was truly a world-class athlete.

Speaker 1:

He ran a 4.2-something forty at a track practice and only ran it once. He kept running right through the time, just kept running. Okay, Dr. Bradley, thank you so much. And everybody out there, please tune in next time for another episode of Taking the Supply Chain Pulse. Thanks again, Randy, see you. It's been a pleasure, take care.

Chapter Markers

Evolution of Healthcare Supply Chains
AI, Corporate Brain Drain, and Greed
Ethical Challenges in AI and Workforce
The Impact of AI and Plagiarism