Understanding Prompt Engineering Basics

Learn about the fundamentals of prompt engineering and how to combine different elements of prompts for effective communication with Llama 3.

Think of LLMs as skilled chefs capable of making amazing dishes with many ingredients. However, just like chefs need a clear recipe to make a dish, language models need well-crafted prompts to produce the desired output.

Prompt engineering is the art of writing a prompt, choosing the right words to guide the model’s response, and adjusting them until you get exactly what you want.


Let's explore prompt engineering in more detail and discover different ways to write an effective prompt.

What is prompt engineering?

A prompt is the set of instructions given to an LLM to harness its full potential and generate a response. Prompt engineering takes this a step further by carefully crafting and refining these prompts to guide the LLM toward producing the exact output we need.

Writing prompts with clear objectives, goals, and requirements helps the model understand what users want, leading to more relevant and accurate responses. We can transform a simple query into a well-crafted prompt that guides the AI model to understand and generate better responses using various elements of prompts. Think of elements as building blocks that are combined to form an effective prompt.

Disclaimer: In this course on Llama 3, we aim to be respectful and inclusive of all gender identities. We've tried to use gender-neutral language and avoid content that excludes or discriminates against anyone. However, in certain scenarios, like when explaining writing prompts and related concepts, we use male and female gender pronouns (he/she). This decision is made for clarity and practical illustration purposes only. Please understand that using specific gender pronouns in this context is not intended to exclude or marginalize any individual or group.

Elements of prompts

The fundamental elements of prompts are listed below:

  • Instruction

  • Context

  • Constraints

  • Format

  • Variables

Let's explore how to combine these elements to refine our prompt for desirable results.

Instruction

The instruction is the most basic element of a prompt, providing clear and concise guidelines for the model to follow. It directs the model's action, ensuring it performs the desired task with precision and accuracy. The instruction should be straightforward, unambiguous, concise, and focused, allowing the model to generate an accurate response.

An instruction can take various forms, such as:

  • Question: "How many months are there in a year?"

  • Task-oriented statement: "Summarize the following paragraph from Wikipedia..."

  • A directive to generate content: "Write a poem about a boy who..."


Consider a scenario where an employee, John, arrives at the office on a Monday morning, and his manager informs him that he needs to travel to Dubai for an urgent business meeting with a client. John has never traveled before, so he quickly contacts a travel agency to help him plan the trip. The travel agent, Olivia, decides to use the Llama 3 model to create a packing list for the trip. Let's see how the model responds.

Prompt: Create a packing list for a trip

That's perfect. The model successfully created a packing list for John.

Does John really need all of these things for a two-day business trip?

Context

Context is any additional information or relevant background related to the query that is necessary for the model to generate an accurate response. Providing context helps the model respond to queries according to the provided information, guiding it in the right direction to generate a relevant response.


Let's refine the prompt by providing more details about the trip, such as the destination, duration, and type of trip. In doing so, we help Llama 3 generate a more specific packing list that meets John's specific requirements.

Prompt: Create a packing list for a two-day business trip to Dubai

As we can see, by providing context such as the duration, type of trip, and destination, we helped the model generate a more specific and relevant packing list.

Constraints

LLMs are powerful machine learning models that can produce long, open-ended responses if they are not properly guided. To keep the output focused, we apply constraints to our prompts.

Constraints are the limitations or restrictions that need to be implemented on the response generated by the model. They specify the scope of the response, such as what to include and what not to include, length, tone, audience, and much more. Constraints are also an important element of the prompt as they allow the model to generate a response within specific boundaries that meet the user’s requirements.

Let's try adding some constraints to our prompt, like the audience, weight limit, and the categories of items to pack, and see how the Llama 3 response changes now.

Prompt: Create a packing list for a two-day business trip to Dubai for a 40-year-old man. Consider that the luggage has a strict weight limit of 10kg, and we need to pack professional attire, toiletries, and work items.

We can see that the model now generated a response within our specified constraints.

Format

There are times when we need a response that is not only accurate but also organized in a specific order or structure. For example, we may need our response in bullets, tables, categories, or any other custom style. This is where the format element comes in.

The format is the structure or layout of our generated response. It is the instruction on organizing, styling, presenting, and meeting any other requirement of a generated response.


The format helps us define the structure of the output up front in the prompt. Without it, the model chooses its own structure, and the output may need to be reformatted later to meet the requirement.

In the business trip scenario, the travel agent has the response from Llama 3, but the packing list alone is not enough: she also needs to share it with John. She can either write an email herself or specify the output format (an email) in her prompt, saving time and effort.

Prompt: Create a packing list for a two-day business trip to Dubai for a 40-year-old man. Consider that the luggage has a strict weight limit of 10 kg, and we need to pack professional attire, toiletries, and work items. Provide the response in the format of an email.

We can see that the Llama 3 model successfully generated a response in an email format. The travel agent can simply replace the receiver and sender names and send this email to John.
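When prompts like these are built in application code rather than typed by hand, it can help to keep each element separate and join them just before sending the prompt to the model. The short Python sketch below is purely illustrative; the build_prompt helper is a hypothetical name, not part of Llama 3 or of this course's scenario, and it simply combines the instruction, context, constraints, and format from the packing-list example into one prompt string.

```python
# Illustrative sketch: assembling the prompt elements into a single prompt string.
# The build_prompt helper is a hypothetical name, not part of any library.

def build_prompt(instruction, context="", constraints="", output_format=""):
    """Combine the prompt elements, skipping any that are left empty."""
    parts = [instruction, context, constraints, output_format]
    return " ".join(part for part in parts if part)

prompt = build_prompt(
    instruction="Create a packing list",
    context="for a two-day business trip to Dubai for a 40-year-old man.",
    constraints=(
        "Consider that the luggage has a strict weight limit of 10 kg, and we "
        "need to pack professional attire, toiletries, and work items."
    ),
    output_format="Provide the response in the format of an email.",
)
print(prompt)
```

Keeping the elements separate like this makes it easy to experiment with one element at a time, mirroring how we refined the prompt step by step above.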

Variables

We can further tweak the prompt to make it more efficient. But before we see how, try answering the following questions:

  1. Can the travel agent get Llama 3 to fill in the names itself, removing even the small effort of writing them in?

  2. How can the travel agent efficiently send such emails to multiple customers?

Think about it!

Well, a simple solution that comes to mind is mentioning the names in the prompt and writing a new prompt for each customer according to their trip details, but this would require a lot of effort, right?

How about automating this task? The travel agent can save time and effort by writing a template prompt with variables for trip details. Now, she just needs to change the variable details according to the trip, and the prompt is ready for Llama 3.

Variables involve using placeholders or flags to customize the prompt. We can specify variables in our prompt using the {{variable_name}} format. This is the standard and most widely used format for writing variables, but there are other formats for providing variables, including:

  • Curly braces: {variable_name}

  • Square brackets: [variable_name]

  • Parentheses: (variable_name)

  • Dollar sign: $variable_name

  • Percent signs: %variable_name%

Let's try writing a template prompt using variables.

Prompt: {{Receiver}} = John

{{Sender}} = Olivia

{{Days}} = 2 days

{{TripType}} = business trip

{{Destination}} = Dubai

{{Weight}} = 10 kg


Write an email from {{Sender}} to {{Receiver}} with a packing list for a {{Days}} {{TripType}} to {{Destination}}. Consider that the luggage has a strict weight limit of {{Weight}}.

Provide the response in the format of an email.

Now, sending an email is very easy for the travel agent. She will just update the variables with the trip details of the new customer and Llama 3 will generate an email with customized packing essentials for the customer.
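To see what this automation could look like in code, here is a minimal Python sketch. The fill_template helper and the dictionary of trip details are illustrative assumptions rather than part of the course playground; the template follows the {{variable_name}} style used above.

```python
# Illustrative sketch: filling a {{variable_name}} prompt template with trip details.
# The fill_template helper is a hypothetical name, not part of any library.

TEMPLATE = (
    "Write an email from {{Sender}} to {{Receiver}} with a packing list for a "
    "{{Days}} {{TripType}} to {{Destination}}. Consider that the luggage has a "
    "strict weight limit of {{Weight}}. Provide the response in the format of an email."
)

def fill_template(template, variables):
    """Replace each {{name}} placeholder with its value."""
    for name, value in variables.items():
        template = template.replace("{{" + name + "}}", value)
    return template

trip = {
    "Receiver": "John",
    "Sender": "Olivia",
    "Days": "two-day",
    "TripType": "business trip",
    "Destination": "Dubai",
    "Weight": "10 kg",
}

print(fill_template(TEMPLATE, trip))
```

For each new customer, only the dictionary values change; the template prompt stays the same.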

Note: Finding the perfect prompt can be a challenge. It often requires patience and persistence as we refine and apply many elements through multiple iterations to find one that generates the desired results.

Importance of prompt engineering

LLMs have revolutionized the AI landscape with their ability to understand and generate human-like language. The following are the key reasons that highlight the importance of prompt engineering in AI:

  • Accuracy and relevance: Well-designed prompts optimize interaction with LLMs, leading to more accurate and relevant responses, essential for applications such as text summarization, question-answering, and more.

  • Performing complex tasks: Prompt engineering enables AI models to perform complex tasks and generate accurate responses, significantly expanding the capabilities of language models.

  • User experience: Effective prompts enable users to access their required information more quickly and easily, enhancing the overall user experience with LLMs and making them more accessible and beneficial for a diverse range of users.

  • Ethical considerations: Prompt engineering plays a vital role in ensuring that language models generate responsible and ethical responses that are aligned with human values and carefully avoid biases.

Prompt engineering is a vital component of AI development. As AI continues to transform industries and daily life, mastering prompt engineering becomes a key skill to maximize its potential.

Try it yourself

Enter your prompts in the coding playground below and see the model's responses.

Scenario: You're a shopping assistant at a fashion store. A customer with a budget of $100 comes in looking for a formal black outfit for a wedding. Write a prompt using prompt engineering elements to help the customer find the perfect outfit.

Note: The responses in the coding playground below may vary from those of the Meta AI web app since we are using an instruction-tuned version of the Llama 3 model from the Together AI platform. (Instruction tuning is the process of training LLMs on a dataset of instruction and output pairs.)
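For readers who want to experiment outside the embedded playground, the sketch below shows one way such a call might look in Python. It assumes an OpenAI-compatible chat completions endpoint on the Together AI platform; the endpoint URL, model name, and the TOGETHER_API_KEY environment variable are assumptions and may not match what this course's playground actually uses.

```python
# Hedged sketch: sending a prompt to an instruction-tuned Llama 3 model through an
# OpenAI-compatible chat completions endpoint. The URL, model name, and the
# TOGETHER_API_KEY environment variable are assumptions, not course-provided values.
import os
import requests

API_URL = "https://api.together.xyz/v1/chat/completions"  # assumed endpoint
MODEL = "meta-llama/Llama-3-8b-chat-hf"                    # assumed model name

prompt = (
    "You are a shopping assistant at a fashion store. A customer with a budget of "
    "$100 is looking for a formal black outfit to wear to a wedding. Suggest a "
    "complete outfit within the budget and present it as a bulleted list with prices."
)

response = requests.post(
    API_URL,
    headers={"Authorization": f"Bearer {os.environ['TOGETHER_API_KEY']}"},
    json={
        "model": MODEL,
        "messages": [{"role": "user", "content": prompt}],
        "max_tokens": 512,
    },
    timeout=60,
)
response.raise_for_status()
print(response.json()["choices"][0]["message"]["content"])
```

Replace the assumed endpoint, model name, and API key with your own values before running it.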
