Introduction
With the introduction of Large Language Models, the use of LLMs in various applications has grown tremendously. LLMs are now a part of most recent applications developed across many problem statements. Much of the NLP space, including chatbots, sentiment analysis, topic modelling, and more, is handled by Large Language Models. Working directly with these LLMs can often be difficult, which has led to tools like LangChain and LlamaIndex that help simplify creating applications with Large Language Models. In this guide, we will look at one such tool called Phidata, which simplifies building real-world applications with LLMs.
Learning Objectives
- Understand the basics and purpose of Phidata for building LLM applications
- Learn how to install Phidata and its dependencies
- Gain proficiency in creating and configuring LLM assistants using Phidata
- Explore integrating various tools with LLMs to enhance their capabilities
- Develop skills to create structured outputs using Pydantic models within Phidata

This article was published as a part of the Data Science Blogathon.
What is Phidata?
Phidata is a popular Python library built to create real-world applications with Large Language Models by integrating them with Memory, Knowledge, and Tools. With Phidata, one can add memory to an LLM, which can include chat history, or combine the LLM with a Knowledge Base, i.e., vector stores, with which we can build RAG (Retrieval Augmented Generation) systems. Finally, LLMs can even be paired with Tools, which allow the LLMs to perform tasks that are beyond their native capabilities.
Phidata is integrated with different Large Language Models like OpenAI, Groq, and Gemini, and even supports open-source Large Language Models through Ollama. Similarly, it supports the OpenAI and Gemini embedding models and other open-source embedding models through Ollama. Currently, Phidata supports several popular vector stores like Pinecone, PGVector, Qdrant, and LanceDB. Phidata even integrates with popular LLM libraries like LangChain and LlamaIndex.
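To give a taste of the Knowledge side before we dive in, below is a minimal sketch of attaching a PDF-backed knowledge base to an assistant for RAG, following the pattern from the Phidata documentation. It assumes a Postgres instance with the pgvector extension is reachable at the given db_url; the PDF URL is just an illustrative example.
from phi.assistant import Assistant
from phi.knowledge.pdf import PDFUrlKnowledgeBase
from phi.vectordb.pgvector import PgVector2

# Build a knowledge base from a PDF URL, embedded into a PgVector collection
# (assumes Postgres with pgvector is running at db_url)
knowledge_base = PDFUrlKnowledgeBase(
    urls=["https://phi-public.s3.amazonaws.com/recipes/ThaiRecipes.pdf"],
    vector_db=PgVector2(
        collection="recipes",
        db_url="postgresql+psycopg://ai:ai@localhost:5532/ai",
    ),
)
knowledge_base.load(recreate=False)  # embed and store the documents once

assistant = Assistant(
    knowledge_base=knowledge_base,
    # inject retrieved chunks into the prompt for RAG
    add_references_to_prompt=True,
)
assistant.print_response("How do we make a Thai curry?", markdown=True)
We will not use a knowledge base further in this guide, but the same Assistant class is the entry point for everything that follows.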
Getting Started with Phidata Assistants
In this section, we will see how to download the Phidata Python library and get started with it. Along with Phidata, we will need a few other Python libraries, including Groq, DuckDuckGo Search, and OpenAI. For this, we run the following command:
!pip install -q -U phidata groq==0.7 duckduckgo-search openai
Code Explanation
- phidata: This is the library we will work with to create LLM applications
- groq: Groq is a company known for building new hardware to run LLMs faster, called the LPU (Language Processing Unit). This library will let us access the Mixtral model that runs on the Groq LPUs
- duckduckgo-search: With this library, we can search the internet. The Large Language Model can leverage it to perform web searches
- openai: This is the official library from OpenAI to work with the latest GPT models
Setting Up API Keys
Running this will download all the Python modules we need. Before starting, we need to store the API keys in the environment so that Phidata can access them to work with the underlying models. For this, we do the following:
import os

# GROQ_API and OPENAI_API_KEY are assumed to be variables already holding your keys
os.environ['GROQ_API_KEY'] = GROQ_API
os.environ['OPENAI_API_KEY'] = OPENAI_API_KEY
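If you prefer not to keep the keys in plain variables, here is a small alternative sketch using Python's built-in getpass to enter them interactively:
import os
from getpass import getpass

# Prompt for the keys at runtime instead of hardcoding them
os.environ['GROQ_API_KEY'] = getpass("Groq API key: ")
os.environ['OPENAI_API_KEY'] = getpass("OpenAI API key: ")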
You can get a free API key from Groq's official website. With it, we can access the Mixtral Mixture of Experts model. By default, Phidata works with the OpenAI models, so we also provide the OpenAI API key. Running the code will save the Groq and OpenAI API keys to the environment. So let us get started with the Phidata library.
from phi.assistant import Assistant

assistant = Assistant(
    description="You are a helpful AI Assistant who answers every user query",
)

assistant.print_response("In short, explain black holes to a 7 year old?", markdown=True)
Code Explanation
- We start by importing the Assistant class from phi.assistant
- Then, we instantiate an Assistant by providing a description. An Assistant wraps a Large Language Model; to this LLM, we can provide a description, system prompts, and other LLM configurations to define the assistant
- We call the LLM through the .print_response() method. To it, we pass the user query and also provide another parameter, markdown=True
Running the code defines an Assistant with an OpenAI GPT model as the LLM; the user query is then given to this Large Language Model, and finally the output is generated. The generated output is formatted neatly. By default, the output is generated in a streaming fashion. We can see the output below.

We see the response generated in a well-formatted form. Here, we even see the user question along with the answer generated by the Large Language Model. Setting markdown=True is the reason the output is printed in a readable format. Now, if we wish to change the OpenAI model we want to work with, we can check the code below.
Changing the OpenAI Model
Here's the code:
from phi.assistant import Assistant
from phi.llm.openai import OpenAIChat

assistant = Assistant(
    llm=OpenAIChat(model="gpt-3.5-turbo"),
    description="You help people with providing useful answers to their queries",
    instructions=["List should contain only 5"],
    max_tokens=512
)

assistant.print_response("List some of the top cricketers in the world", markdown=True)

Code Explanation
- We start by importing the Assistant class from phi.assistant, and we also import OpenAIChat from phi.llm.openai
- Now, we again create an Assistant object. To it, we give a parameter called llm, to which we provide the OpenAIChat() class with the model name; here it is GPT-3.5
- We also give some extra parameters like instructions, to which we pass the instructions we want the LLM to follow, and max_tokens, to limit the length of the LLM's generation
- Finally, we call .print_response() and give the user query to it
Running the code generates an output, which we can see below. The OpenAI model has indeed followed the instructions we gave it: it produced only 5 elements in the list, which aligns with our instructions. Apart from OpenAI, Phidata is integrated with other large language models. Let us try working with the Groq model using the following code.
Working With the Groq Model
Here's the code:
from phi.assistant import Assistant
from phi.llm.groq import Groq

assistant = Assistant(
    llm=Groq(model="mixtral-8x7b-32768"),
    description="You help people with providing useful answers to their queries",
    max_tokens=512
)

response = assistant.run('How much height can a human fly?', stream=False)
print(response)

Code Explanation
- We start by importing the Assistant class from phi.assistant, and we also import Groq from phi.llm.groq
- Now, we again create an Assistant object. To it, we give a parameter called llm, and to this parameter, we provide the Groq() class with the model name, that is, the Mixtral Mixture of Experts
- We also give a description and max_tokens while creating the Assistant object
- Here, instead of .print_response(), we call the .run() function, which does not print the output in markdown format but returns it as a string
- We also set stream to False here so we can see the entire output in one go
Running this produces the output we can see above. The response generated by Mixtral-8x7B looks good. It recognizes that the user query is nonsensical and still provides the right answer to it. Sometimes, we need the output generated by the LLM to follow a structure, i.e., we want the LLM's output in a structured format so it becomes easy to take this output and process it in later code. Phidata supports exactly this, and we can do so with the following example.
Let's Create a Travel Itinerary Using Phidata
Here's the code:
from typing import List
from pydantic import BaseModel, Field
from phi.assistant import Assistant
from phi.llm.groq import Groq

class TravelItinerary(BaseModel):
    destination: str = Field(..., description="The destination of the trip.")
    duration: int = Field(..., description="The duration of the trip in days.")
    travel_dates: List[str] = Field(..., description="List of travel dates in YYYY-MM-DD format.")
    activities: List[str] = Field(..., description="Planned activities for the trip.")
    accommodation: str = Field(..., description="Accommodation details for the trip.")
    budget: float = Field(..., description="Estimated budget for the trip.")
    travel_tips: str = Field(..., description="Useful travel tips for the destination.")

travel_assistant = Assistant(
    llm=Groq(model="mixtral-8x7b-32768"),
    description="You help people plan travel itineraries.",
    output_model=TravelItinerary,
)

print(travel_assistant.run("New York"))
Code Explanation
- We start by importing the necessary libraries, which include pydantic, typing, and our Assistant
- Then, we define our structured output. For this, we create a class; in our example, the TravelItinerary class, which inherits from the Pydantic BaseModel
- In this class, we declare different variables that hold different pieces of travel-related information
- For each variable, we provide the data type it stores, and in the description of the Pydantic Field object, we write what it is expected to hold
- Finally, we create a travel assistant object by calling the Assistant class and giving it all the parameters it needs, which include the LLM, the description, and the output_model parameter, which takes in our Pydantic model
Now, we call the .run() function of the assistant and give it a location name to get the travel itinerary. Running this produces the output below.

We can see that the output generated from the assistant is a Pydantic object, i.e., an instance of the TravelItinerary class that we defined. Each variable in this class is filled according to the descriptions given while creating the class. Mixtral MoE 8x7B has done a great job of filling the TravelItinerary object with the right values. It has provided us with the destination, travel dates, duration of stay, and a list of activities to perform at that location. Along with that, it even provides travel tips to help save money.
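Because the response is a regular Pydantic object, its fields can be used directly in later code. Here is a small usage sketch (the variable name itinerary is just for illustration):
itinerary = travel_assistant.run("New York")

# Access individual fields like any Pydantic model
print(itinerary.destination)
print(itinerary.duration, "days")
for activity in itinerary.activities:
    print("-", activity)

# Or serialize the whole itinerary, e.g. for an API response
# (use .json() instead if you are on Pydantic v1)
print(itinerary.model_dump_json(indent=2))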
Large Language Models by themselves are quite limited in terms of usability. LLMs can only generate text. But there are cases where one wants to fetch the latest news or retrieve information that requires an API call. In these scenarios, we need tools.
Tools are the LLM's weapons. LLMs can work with tools to perform actions that are impossible for vanilla LLMs. These tools can be API tools, which LLMs call to make API requests and fetch information, or math tools, which LLMs work with to perform mathematical operations.
Built-in Tools in Phidata
Phidata already has some built-in tools that LLMs can work with. Let us try them with the following code:
from phi.assistant import Assistant
from phi.llm.groq import Groq
from phi.tools.duckduckgo import DuckDuckGo

assistant = Assistant(
    llm=Groq(model="mixtral-8x7b-32768"),
    tools=[DuckDuckGo()],
    show_tool_calls=True,
    description="You are a senior BBC researcher writing an article on a topic.",
    max_tokens=1024
)

assistant.print_response("What is the latest LLM from Google?", markdown=True, stream=False)

Code Explanation
- We start by importing the Assistant class from phi.assistant and, for the tool, the DuckDuckGo tool from phi.tools.duckduckgo (we also import Groq from phi.llm.groq for the LLM)
- Next, we instantiate an Assistant object by calling the Assistant class with different parameters. These include the llm parameter, where we provide the Groq model, along with the description and max_tokens parameters
- Here, we also provide another parameter called tools, to which we give a list of tools; in this case, a list with a single element, DuckDuckGo
- To check whether the Assistant has called the tool or not, we can give another parameter called show_tool_calls=True, which displays the tool call if one is made
- Finally, we call the .print_response() function and give it a user query asking for information about the latest Google LLMs
Running this code creates an assistant with the DuckDuckGo tool. When we call the LLM with the user query "What is the latest LLM from Google?", the LLM decides whether to make a function call. The output picture shows that the LLM has made a function call to the DuckDuckGo search tool with the query "latest LLM from Google."
The results fetched from this function call are fed back to the model along with the original query so that the model can work with this data and generate the final response for the user query, which it did here. It generated the right response about the latest LLMs from Google: Gemini 1.5 Flash and PaliGemma, which the Google team recently announced. We can also create our own tools instead of relying only on the Phidata tools. One example of this can be seen below.
Creating a Custom Tool
Here's the code:
from phi.assistant import Assistant
from phi.llm.groq import Groq

def power(Base: float, Exponent: float) -> str:
    "Raise Base to the Exponent power"
    return str(Base**Exponent)

assistant = Assistant(
    llm=Groq(model="mixtral-8x7b-32768"),
    tools=[power],
    show_tool_calls=True)

assistant.print_response("What is 13 to the power 9.731?", stream=False)

Code Explanation
- Here, we start by importing the Assistant class from the Phidata library
- Then we define a function called power(), which takes in two floating point numbers called Base and Exponent and returns the Base raised to the Exponent power
- We return a string because Phidata tools expect the output to be returned in string format
- Then, we create an assistant object and give this new tool, in the form of a list, to the tools parameter along with the other parameters
- Finally, we call the assistant with a math query related to the function
Running this code produces the following output. We can see in the output that the model did perform a function call, calling the power function with the right arguments taken from the user query. Finally, it takes in the response generated by the function, and the assistant generates the final response to the user's question. We can even give multiple functions to the LLM and let the assistant call these functions multiple times. For this, let us check out the code below.
Verifying Tool Invocation
Here's the code:
from phi.assistant import Assistant
from phi.llm.groq import Groq

def power(Base: float, Exponent: float) -> str:
    "Raise Base to the Exponent power"
    return str(Base**Exponent)

def division(a: float, b: float) -> str:
    "Divide a by b"
    return str(a/b)

assistant = Assistant(
    llm=Groq(model="mixtral-8x7b-32768"),
    tools=[power, division],
    show_tool_calls=True)

assistant.print_response("What is 10 power 2.83 divided by 7.3?", stream=False)
In this code, we have defined two functions. The first function takes in two floats called Base and Exponent and returns a string containing the Base raised to the Exponent. The second is a division function: given two floats, a and b, it returns the string of a/b.
Then, we create an assistant object with Groq as the Large Language Model and give these two tools as a list to the tools parameter while creating the assistant object. Finally, we provide a query related to these tools to check their invocation. We give "What is 10 power 2.83 divided by 7.3?"
From the output below, we can see that the two tools get called. The first is the power tool, which was called with the appropriate arguments. Its answer is then passed as one of the arguments when calling the second tool. Finally, the answer generated by the second function call is given to the LLM so that it can generate the final response to the user query.

Phidata even lets us create a CLI app with the Assistant. We can create such an application with the following code.
Creating a CLI Application
Here's the code:
from phi.assistant import Assistant
from phi.tools.duckduckgo import DuckDuckGo

assistant = Assistant(tools=[DuckDuckGo()],
                      show_tool_calls=True,
                      read_chat_history=True)

assistant.cli_app(markdown=True)
- We start by importing Assistant and DuckDuckGo from the Phidata library
- Then, we create an assistant object by calling the Assistant class and giving it the tools we wish to work with
- We also set read_chat_history to True, which allows the model to read the chat history when needed
- Finally, we call the .cli_app() function of the assistant object and set markdown to True so as to force the model to produce markdown responses
Running the above command creates a terminal app where we can chat with the model. The conversation can be seen in the pic below.


In the first reply, the assistant called the DuckDuckGo function to get information about the latest models from Google. Then, in the second turn of the conversation, we can notice that it invoked the get_chat_history function, which was necessary given the user query, and it was able to answer the question correctly.
Building a Team of Assistants
Creating a team of assistants is possible through the Phidata library. It lets us create a team of assistants who can interact with one another and delegate work to each other to perform specific tasks, with the right tools assigned to each assistant. Phidata simplifies the process of creating such a team. We can create a simple assistant team with the following code:
Here's the code:
from phi.assistant import Assistant
from phi.llm.groq import Groq

def power(base: float, exponent: float) -> str:
    "Raise base to the exponent power"
    return str(base**exponent)

math_assistant = Assistant(
    llm=Groq(model="mixtral-8x7b-32768"),
    name="Math Assistant",
    role="Performs mathematical operations like taking the power of two numbers",
    tools=[power],
)

main_assistant = Assistant(
    llm=Groq(model="mixtral-8x7b-32768"),
    name="Research Team",
    show_tool_calls=True,
    team=[math_assistant],
)

main_assistant.print_response(
    "What is 5 power 9.12?",
    markdown=True,
    stream=False,
)
- The code starts by importing the required modules from the Phidata library: Assistant from phi.assistant and Groq from phi.llm.groq
- We then define a function named power, which takes two float parameters (base and exponent) and returns the result of raising the base to the exponent as a string
- An instance of Assistant named math_assistant is created with attributes like llm, where we give the LLM we want to work with; name, where we call it the Math Assistant; tools, where we provide the power tool; and a role attribute defining the role of the assistant
- Similarly, we create a main_assistant, but here we provide the team attribute, giving it a list of assistants to which it can delegate work; in our code, that is the math_assistant
- The print_response function is called on main_assistant with a query ("What is 5 power 9.12?"), formatted to display in Markdown
Running this produces the output below:

When the code runs, the main_assistant is fed the user query. The query contains a math problem, which involves taking a power, so the main_assistant sees this and delegates the work to the math_assistant. We can see this function call in the pic. The math_assistant then gets the user query from the main_assistant and creates a function call to the power tool with the base and the exponent. The answer returned from the function call is fed back to the main_assistant by the math_assistant. Finally, the main_assistant takes this answer and creates a final response to the user query.
Conclusion
In conclusion, Phidata is a simple and powerful Python library designed to ease the creation of real-world applications using Large Language Models (LLMs). By integrating memory, knowledge bases, and tools, Phidata lets developers extend LLMs well beyond plain text generation. This guide has shown how easy it is to build with Phidata, work with different LLMs like OpenAI and Groq, and extend functionality through different tools and assistants. Phidata's ability to produce structured outputs, work with tools for real-time information retrieval, and create collaborative teams of assistants makes it a go-to tool for building reliable and sophisticated LLM applications.
Key Takeaways
- Phidata simplifies building LLM applications by integrating memory, knowledge, and tools
- Phidata supports structured outputs using Pydantic models, improving data handling
- Developers can build teams of assistants that delegate tasks to one another for complex workflows
- Phidata's CLI application feature allows for interactive, terminal-based conversations with LLMs
- It supports popular LLMs like OpenAI, Groq, and Gemini and even integrates with different vector stores
The media shown in this article are not owned by Analytics Vidhya and are used at the Author's discretion.
Frequently Asked Questions
Q1. What is Phidata?
A. Phidata is a Python library designed to simplify building real-world applications with Large Language Models (LLMs). It allows you to integrate LLMs with memory, knowledge bases, and tools to create powerful applications.
Q2. Can Phidata produce structured outputs?
A. Yes. You can define a structured output format using Pydantic models and retrieve the LLM response in that format.
Q3. What do Phidata tools do?
A. Phidata tools extend LLM capabilities by allowing them to perform actions like API calls, internet searches (using DuckDuckGo), or mathematical calculations (through user-built functions).
Q4. Can I create my own tools?
A. Absolutely! You can define your own Python functions to perform specific tasks and include them as tools to be integrated with the Assistant object.
Q5. Can Phidata build teams of assistants?
A. Yes, Phidata allows the creation of multiple assistants with assigned roles and tools. You can create a main assistant that delegates tasks to other assistants based on their expertise in different areas.


