
Why Critical Thinking is Crucial for AI [AI Today Podcast]

The widespread adoption and use of generative AI means that folks no longer need to be experts in “hard” skills such as statistics and probability, calculus, or linear algebra to get value from using generative AI. Instead, the need to use soft skills such as communication, curiosity, problem solving, and adaptability is becoming more …

Why Critical Thinking is Crucial for AI [AI Today Podcast] Read More »

Prompt Engineering Best Practices: Soft Skills [AI Today Podcast]

Generative AI is one of the most accessible forms of AI currently available. While in the past you might have used AI without knowing it, you can use generative AI purposefully in ways that have an immediate and dramatic impact on your daily life. In this episode of AI Today, hosts Kathleen Walch and Ron Schmelzer …

Prompt Engineering Best Practices: Soft Skills [AI Today Podcast] Read More »

Prompt Engineering Best Practices: Hack and Track [AI Today Podcast]

Experimenting, testing, and refining your prompts are essential. The journey to crafting the perfect prompt often involves trying various strategies to discover what works best for your specific needs. A best practice is to constantly experiment, practice, and try new things using an approach called “hack and track”. This is where you use a spreadsheet …

Prompt Engineering Best Practices: Hack and Track [AI Today Podcast] Read More »
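The “hack and track” approach described above amounts to keeping a running log of prompt experiments and their results. Below is a minimal Python sketch of such a log, using a CSV file in place of a spreadsheet; the column names and example values are illustrative assumptions, not taken from the episode.

```python
# Minimal "hack and track" log: append each prompt experiment to a CSV file
# so prompt variants and their results can be compared later.
# The columns here are illustrative, not prescribed by the episode.
import csv
from datetime import date
from pathlib import Path

LOG_FILE = Path("prompt_experiments.csv")
FIELDS = ["date", "model", "prompt", "output_summary", "rating", "notes"]

def log_experiment(model: str, prompt: str, output_summary: str,
                   rating: int, notes: str = "") -> None:
    """Append one prompt experiment to the tracking file."""
    new_file = not LOG_FILE.exists()
    with LOG_FILE.open("a", newline="", encoding="utf-8") as f:
        writer = csv.DictWriter(f, fieldnames=FIELDS)
        if new_file:
            writer.writeheader()  # write the header row once
        writer.writerow({
            "date": date.today().isoformat(),
            "model": model,
            "prompt": prompt,
            "output_summary": output_summary,
            "rating": rating,
            "notes": notes,
        })

# Example entry (hypothetical values)
log_experiment(
    model="gpt-4",
    prompt="Summarize this meeting transcript in five bullet points.",
    output_summary="Clear bullets, but missed the action items.",
    rating=3,
    notes="Try asking explicitly for action items next time.",
)
```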

Prompt Engineering Best Practices: Using Custom Instructions [AI Today Podcast]

As folks continue to use LLMs, best practices are emerging to help users get the most out of them. OpenAI’s ChatGPT allows users to tailor responses to match their tone and desired output goals. Many have reported that using custom instructions results in much more accurate, precise, consistent, and predictable results. But why would you …

Prompt Engineering Best Practices: Using Custom Instructions [AI Today Podcast] Read More »
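Custom instructions are configured in the ChatGPT interface itself, but the underlying idea can be sketched programmatically as a persistent preamble applied to every prompt. The sketch below assumes a hypothetical call_llm helper standing in for whatever LLM client you actually use.

```python
# Sketch: treating "custom instructions" as a persistent preamble that is
# prepended to every prompt, so tone and output format stay consistent.

CUSTOM_INSTRUCTIONS = (
    "You are responding to a business audience. "
    "Keep answers under 200 words, use plain language, "
    "and end with a short bulleted summary."
)

def call_llm(prompt: str) -> str:
    """Hypothetical stand-in for a real LLM API call."""
    raise NotImplementedError("Replace with your LLM client of choice.")

def ask(question: str) -> str:
    """Send a question with the custom instructions applied every time."""
    return call_llm(f"{CUSTOM_INSTRUCTIONS}\n\n{question}")
```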

Prompt Engineering Best Practices: What is Prompt Chaining? [AI Today Podcast]

To improve the reliability and performance of LLMs, sometimes you need to break large tasks or prompts into sub-tasks. Prompt chaining is the practice of splitting a task into sub-tasks to create a chain of prompt operations. Prompt chaining is useful if the LLM is struggling to complete your larger, complex task in one step. In …

Prompt Engineering Best Practices: What is Prompt Chaining? [AI Today Podcast] Read More »
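As a rough illustration of prompt chaining, the Python sketch below splits one task (answering a long email) into two sub-tasks, feeding the output of the first prompt into the second. The call_llm helper and the specific prompts are hypothetical placeholders, not from the episode.

```python
# Sketch of prompt chaining: a large task is split into sub-tasks, and the
# output of each prompt becomes input to the next.

def call_llm(prompt: str) -> str:
    """Hypothetical stand-in for a real LLM API call."""
    raise NotImplementedError("Replace with your LLM client of choice.")

def summarize_and_draft_reply(long_email: str) -> str:
    # Step 1: extract the key points from the email.
    key_points = call_llm(
        "List the key points and any requests in this email:\n\n" + long_email
    )
    # Step 2: use the extracted points (not the full email) to draft a reply.
    reply = call_llm(
        "Draft a polite reply that addresses each of these points:\n\n" + key_points
    )
    return reply
```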

Prompt Engineering Best Practices: Using a Prompt Pattern [AI Today Podcast]

LLMs are basically big “text predictors” that try to generate what they expect to be the most likely desired output, based on the text the user provides as input: the prompt. Prompts are natural language instructions for an LLM, provided by a human, so that it will deliver the desired results you’re …

Prompt Engineering Best Practices: Using a Prompt Pattern [AI Today Podcast] Read More »
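One way to picture a prompt pattern is as a reusable template with slots that are filled in for each request. The sketch below uses a persona/task/format structure purely for illustration; the episode itself may cover other patterns.

```python
# Sketch of a reusable prompt pattern: a fixed template with slots that are
# filled in per request. The persona/task/format structure is one common
# pattern, shown here only as an example.

PROMPT_PATTERN = (
    "Act as a {persona}. "
    "{task} "
    "Format your answer as {output_format}."
)

prompt = PROMPT_PATTERN.format(
    persona="patient statistics tutor",
    task="Explain the difference between probability and likelihood.",
    output_format="a short paragraph followed by one concrete example",
)
print(prompt)
```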
