Episode 3  |  48 Min  |  February 05

Leading the AI transformation of your company with Prof. Gregory LaBlanc


Engaging topics at a glance

  • 00:13:40
    What is transformation? What constitutes it?
  • 00:15:29
    Have you seen unpredictable organizational behavior before?
  • 00:16:30
    Learnings that enterprise leaders should pay attention to
  • 00:17:30
    How do organizations overcome fear to adapt?
  • 00:18:55
    Do you foresee AI running parts of companies?
  • 00:21:28
    Is data accessibility a key challenge for AI?
  • 00:23:29
    Are algorithms or data the true competitive edge?
  • 00:25:17
    Will companies without data become irrelevant?
  • 00:30:28
    What is your vision for the future of work?
  • 00:36:53
    Will AI drive higher-order thinking?

"AI Transformation – the new paradigm" with UC Berkeley Professor and AI Startup Expert, Greg La Blanc. Get ready to dive into the future of AI!

For some people, transformation is exciting and challenging. Curiosity and excitement about learning drew Greg into the field of strategy and transformation, and into the many other topics he has taught throughout his career.

Every time you learn something, you are displacing or changing some previous notion of how the world works. For some people, this is disturbing. For others, it is a thrill and really exciting. How you approach transformation is the beginning of how you deal with it, and curiosity is such a powerful human trait.

Technology diffuses rapidly. What does not diffuse rapidly are managerial techniques and organizational architectural innovations. This is why older companies have a difficult time adapting.

– Gregory LaBlanc

Some people would emphasize what they call long-term trends, and others would be more inclined to say everything is new. Similarly, with the digital and AI transformations taking place, you can say everything is new, everything has to be changed, this is something we have never seen before; or you can say this is not that much different from the sorts of things that have happened to us in the past.

As humans, we are in the entropy reduction business. We are trying to create order. We're trying to make sense of our world. We're trying to put in place practices that we can automate. We're trying to create routines and subroutines, and indeed, this is how efficiency happens. Efficiency happens when you start to recognize patterns and engage in repetitive action.

The problem with that is that the circumstances and the environment change. So the routines that you've established need to be changed at some point, and that requires a bit of work. There are a couple of different ways we can respond to that. One is to say, okay, the world has changed, so we have to change the way we're doing things. The other is to say, well, let's try to change the world so that we don't have to change. That often means trying to shape the behavior of your customers or your employees, or trying to use regulation or market power to hold off the onslaught of change.

The third way is to say, let’s change. 

Too much flexibility means that nothing ever gels; too little flexibility means that you get stuck. So you need to figure out what that optimal amount of flexibility is, and then figure out a way to routinize change. That sounds paradoxical. It means creating systems that are intentionally designed to respond to the changing environment. If you can routinize change, you can routinize curiosity. If you can create a standard operating procedure for discovery, then in some ways you can have your cake and eat it too. And that is what all really good dynamic businesses are trying to do.

Every time there's a new discovery in the world of artificial intelligence, people say, now's the time, this is AI. Back in 2015, with neural nets, everyone said: yes, AI, finally. But each of these punctuated discoveries is a continuation of a series of discoveries that have been happening in the world of artificial intelligence over the last couple of decades.

Every time there is a new discovery in the world of artificial intelligence, people say – now is the time. This is AI. This is going to change everything.

– Gregory LaBlanc

The technology diffuses rapidly. What doesn't diffuse as rapidly are managerial techniques and organizational and architectural innovations. That is also why older companies have a tough time adapting: they resist change and the kinds of transformations they would need to undertake in order to enable new technologies.

There is the immune system of the organization, but there is also the immune system of each individual within the organization. The natural propensity of many people is to fight new ideas when they encounter them. Combine that into a big organization, and you can often have an organization where every individual is open to new ideas, but the organization is not, because it has its own logic.

Fear plays a role, but it's not the complete story. It's not always that they're afraid; they feel fairly confident that they can keep this at bay. And this is why leadership is so critical. You need carrots and sticks, but you also need your vision and your messaging.

Even before generative AI, the more primitive forms of machine learning that were easiest to adopt were the ones that perform some relatively narrow task. Suppose you are in HR and you're doing hiring, and someone comes up with a product that helps you process more applications more quickly. You can see how that is going to save you money. Or if you are in marketing and someone comes along and says, I have a great tool that will help you figure out who you should be targeting with your marketing, you will think: I am a revenue center, and I have just boosted my revenue. So, all of those specific applications are actually relatively unproblematic.

Setting aside AI for a second, look at the automotive industry, at a company like Ford or GM that has tier one suppliers, tier two suppliers, tier three suppliers, and so on. If there is an innovation in the steering column, the tier one supplier that makes steering columns will figure it out and start selling it. The challenge is when you want to figure out a way to connect those things.

The current supply chain architecture makes that very difficult, because you need to adjust the design elements of the brake to coordinate better with the design elements of the steering column, and when everything is set up this way, that becomes tough. Whereas with Tesla, which has a much more integrated production and design process, it is easy to make those kinds of shifts. So, the reason the car companies are struggling is that they have tried to incorporate a lot of these new technological innovations into a pre-existing business, supply chain, and value chain architecture that was optimized for the internal combustion engine. That is why a company like Tesla can simply leapfrog them.

Your competitive advantage is always going to come from the data. It is never going to come from your analytics tools. 

If you don’t have the right data to train the model on, the model is just a computer with no apps on it.

– Gregory LaBlanc

If I have access to unique data, I can take cutting-edge algorithms, train them on that data, and gain a competitive edge.

There will be some companies that can live without a solid data strategy, but for the vast majority of companies, if you do not have a data strategy, you're toast.

There are two major takeaways. The first is that in this transformation, your organizational structure is super important: how you organize your company so that data is democratized. The second is having high-quality, unique data. It is not just the quality of the data; its uniqueness is what is going to differentiate you going forward, at least in the next couple of years.

How you balance flexibility and order is also going to be an important skill for all leaders. Our education systems have to teach flexibility, adaptability, how to learn, and how to learn fast.

With artificial intelligence entering all of our jobs, we have to develop higher-order thinking skills.

Production Team
Arvind Ravishunkar, Ankit Pandey, Chandan Jha

Latest podcasts

Episode 10  |  61 Min  |  February 05

What you should know about LLMs with Anupam Datta, Co-founder TruEra, and ex-CMU



Engaging topics at a glance

  • 00:13:40
    What is a Large Language Model (LLM)?
  • 00:18:40
    Is LLM a form of intelligence?
  • 00:20:25
    How LLMs learn compared with how humans learn.
  • 00:22:50
    How do LLMs differ from one another?
  • 00:27:56
    What to consider when choosing LLMs?
  • 00:44:05
    Can LLMs retrieve past human knowledge?
  • 00:51:45
    How can companies harness the power of statistical models?
  • 00:53:05
    Key things to keep in mind when integrating LLMs into the business.

Join us in this episode featuring Anupam Datta, Co-founder and Chief Scientist, TruEra, as we dive into the evolution of LLMs and what they hold for the future!

The world of generative AI has taken us by storm. As enterprise leaders, understanding the technology behind generative AI will give you a competitive advantage as you plan for your companies and businesses. To help you do this, we will unpack large language models (LLMs), the technology that powers AI today and represents a paradigm shift in the field of artificial intelligence.

LLMs can craft meaningful responses across many domains. Their performance has notably improved recently thanks to the substantial increase in model size and data volume.

With the increasing acceptance of this technology, numerous companies are unveiling various large language models (LLMs). It's important to recognize that opting for the largest or highest-performing LLM isn't always the most suitable approach. Instead, one might prefer LLMs that excel at the specific tasks relevant to the application. As an enterprise leader, it's crucial to integrate this understanding into your company's strategy, so that you can identify the appropriate LLMs to match and adapt for your applications. Balancing LLM selection against cost and latency is a pivotal concern for enterprises. Equally essential is thorough validation and assessment of generative outputs, serving as a safeguard before consequential choices are made. Hence, reliability testing at this juncture is paramount.
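The balance between model capability, cost, and latency described above can be made concrete with a simple weighted-scoring sketch. This is purely illustrative: the model names, quality figures, prices, and weights below are assumptions, not benchmarks, and any real selection should use your own evaluation data.

```python
# Hypothetical candidates: name -> (task_quality 0-1, $ per 1k tokens, latency in s).
# All numbers are illustrative assumptions for the sake of the sketch.
candidates = {
    "large-general-model": (0.92, 0.060, 2.5),
    "mid-size-model":      (0.88, 0.010, 1.0),
    "small-tuned-model":   (0.85, 0.002, 0.3),
}

def score(quality, cost, latency, w_quality=0.5, w_cost=0.3, w_latency=0.2):
    """Higher is better: reward task quality, penalize normalized cost and latency."""
    max_cost = max(c for _, c, _ in candidates.values())
    max_latency = max(l for _, _, l in candidates.values())
    return (w_quality * quality
            - w_cost * (cost / max_cost)
            - w_latency * (latency / max_latency))

best = max(candidates, key=lambda name: score(*candidates[name]))
print(best)  # prints "small-tuned-model" with these illustrative weights
```

With these particular weights, the smaller task-specific model wins despite its lower raw quality, which mirrors the point above: the largest model is not always the most suitable choice.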

Furthermore, enterprises need to consider a few other key aspects in this evolving landscape as they build with LLMs. Starting with a well-defined business use case that offers real value is crucial. As LLMs move from development to production, it's important to establish thorough evaluation and observability throughout their lifecycle. Education across the organization is vital to implementing LLMs effectively; companies should train their workforce to adapt to this changing technology stack. Fostering a community around responsible AI development and evaluation can contribute to a better understanding and management of LLMs. With these steps, enterprises can navigate the complexities of LLMs and harness their potential for positive impact.


Top trending insights

Episode 12  |  56 Min  |  February 05

Uncovering GenAI tools and infrastructure with Rajat Monga, Co-Founder, TensorFlow



Engaging topics at a glance

  • 00:16:10
    Google Brain and TensorFlow
  • 00:19:10
    TensorFlow for AI world
  • 00:21:55
    Tooling and infrastructure needs of previous AI models vs. GenAI
  • 00:25:40
    Trade-offs: open-source libraries and frameworks vs. a private company's framework
  • 00:31:30
    Present quality of tools and infrastructure available for GenAI
  • 00:32:55
    How do I build a GenAI team?
  • 00:37:50
    Vendors and the Infrastructure available today to take GenAI into production
  • 00:41:51
    How to differentiate in the GenAI world?

Join us in exploring the evolving space of GenAI tools and infrastructure, featuring Rajat Monga, Co-Founder of TensorFlow and Google Brain.

A good power tool can easily save six to seven hours of work when you're doing woodworking. The world of AI is no different. The tools and infrastructure developed to help you, as enterprise leaders, build artificial intelligence products and features are very important. In this episode, with our guest's help, we will unpack the infrastructure that surrounds you and the tooling that will help you build AI products and services.

We will look at how tooling and infrastructure needs are changing in the world of AI with the increasing adoption of GenAI. One key change that emerged from the talk is that things have evolved to the point where we no longer need to train models from scratch. Foundation models are already available that know our world to some level, reducing the burden of training a model with tons of data. However, models today have become so large that they sometimes don't fit on a single machine.

As connecting your database with these models is very important, we also discussed the trade-offs between open-source and private libraries. Should companies manage data on their own or outsource it? When you are not training your own model, the easiest and fastest way is to use an API; if you want your data on-prem, it will usually cost you more. In the end, it boils down to what is core to you, and often, not every part of the infrastructure is core to a company. So, if data is not your core strength, it is better to outsource it.
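The API-versus-on-prem trade-off above is ultimately a volume question, which a back-of-the-envelope break-even sketch can illustrate. The per-token price and monthly on-prem cost below are invented assumptions, not quotes from any vendor.

```python
# Illustrative assumptions (not real prices): pay-per-use API pricing vs. a
# fixed monthly cost for running a model on-prem (hardware amortization + ops).
API_COST_PER_MILLION_TOKENS = 2.00   # $ per million tokens (assumed)
ONPREM_MONTHLY_COST = 8000.00        # $ per month (assumed)

def cheaper_option(tokens_per_month):
    """Return which deployment is cheaper at a given monthly token volume."""
    api_total = tokens_per_month / 1_000_000 * API_COST_PER_MILLION_TOKENS
    return "api" if api_total <= ONPREM_MONTHLY_COST else "on-prem"

print(cheaper_option(100_000_000))     # low volume: API wins
print(cheaper_option(10_000_000_000))  # high volume: on-prem wins
```

At low volumes the API's pay-as-you-go model dominates; only at very high, sustained volumes (or for strict data-residency needs) does the fixed on-prem cost pay off, which matches the episode's advice to outsource what is not core.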

This episode also uncovered the current tools and infrastructure available for GenAI. Those available for large-scale deployments are going through a rapid evolution; they are not very hard to rebuild or replace, and new companies are emerging in the GenAI tooling and infrastructure space.

When looking at the talent and skills needed for GenAI implementation, it is important to have technically sound people with domain expertise in your organization's particular area.

For differentiation in the market, domain knowledge in your area, relationships with customers, distribution channels, execution, and so on play an even bigger role today. However, in this data-driven world, having proprietary data and knowing how to leverage it can be an added advantage. To find out more, tune in to the full podcast.


Co-create for collective wisdom

This is your invitation to become an integral part of our Think Tank community. Co-create with us to bring diverse perspectives and enrich our pool of collective wisdom. Your insights could be the spark that ignites transformative conversations.

Learn More