
Prompt Engineering Best Practices: Using Plugins [AI Today Podcast]


Plugins for Large Language Models (LLMs) are additional tools or extensions that enhance the LLM’s capabilities beyond its base functions. In this episode hosts Kathleen Walch and Ron Schmelzer discuss this topic in greater detail.

Can I use plugins with ChatGPT?

Plugins can access external databases, perform specific computations, or interact with other software and APIs to fetch real-time data, execute code, and more. In essence, they significantly expand the utility of LLMs, making them more versatile and effective tools for a wide range of applications. They bridge the gap between the static knowledge of a trained model and the dynamic, ever-changing information and capabilities of the external world. Plugins can be used on many different LLMs.

Why use plugins?

People use plugins for a variety of reasons. They give you access to real-time information by pulling up-to-date data from the web or other sources. They can also perform specialized tasks, such as solving complex mathematical problems, generating code, or providing translations, with expertise that might not be fully developed in the base model. Plugins also enable LLMs to interact with other applications and services, allowing for dynamic content generation, automation of tasks, and enhanced user interactions. They also allow for customization and personalization as well as improved performance and efficiency. In the episode we discuss this all in greater detail.
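To make the idea concrete, here is a minimal sketch of the plugin pattern in Python. In a real system the LLM itself chooses which plugin to call (for example via a function-calling interface); here the names (`Plugin`, `register`, `dispatch`, the `calculator` plugin) are all hypothetical illustrations, not any particular vendor's API.

```python
from dataclasses import dataclass
from typing import Callable, Dict

@dataclass
class Plugin:
    """A tool the LLM can invoke: a name, a description it can read, and a function."""
    name: str
    description: str
    run: Callable[[str], str]

# Registry of available plugins, keyed by name.
registry: Dict[str, Plugin] = {}

def register(plugin: Plugin) -> None:
    registry[plugin.name] = plugin

def dispatch(name: str, query: str) -> str:
    """Route a request to the named plugin and return its result as text."""
    return registry[name].run(query)

# Example: a calculator plugin covering the "specialized tasks" case above.
# (eval is restricted to arithmetic by stripping builtins; a real plugin
# would use a proper expression parser.)
register(Plugin(
    name="calculator",
    description="Evaluates simple arithmetic expressions.",
    run=lambda expr: str(eval(expr, {"__builtins__": {}})),
))

print(dispatch("calculator", "2 + 3"))  # → 5
```

The model would see each plugin's `description`, decide when one is needed, and the host application would perform the actual `dispatch` call and feed the result back into the conversation.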
