Building an Earnings Report Agent with Swarm Framework


Imagine if you could automate the tedious task of analyzing earnings reports, extracting key insights, and making informed recommendations, all without lifting a finger. In this article, we’ll walk you through how to create a multi-agent system using OpenAI’s Swarm framework, designed to handle exactly these tasks. You’ll learn how to set up and orchestrate three specialized agents: one to summarize earnings reports, another to analyze sentiment, and a third to generate actionable recommendations. By the end of this tutorial, you’ll have a scalable, modular solution to streamline financial analysis, with potential applications beyond just earnings reports.

Learning Outcomes

  • Understand the fundamentals of OpenAI’s Swarm framework for multi-agent systems.
  • Learn how to create agents for summarization, sentiment analysis, and recommendations.
  • Explore the use of modular agents for earnings report analysis.
  • Securely manage API keys using a .env file.
  • Implement a multi-agent system to automate earnings report processing.
  • Gain insights into real-world applications of multi-agent systems in finance.
  • Set up and execute a multi-agent workflow using OpenAI’s Swarm framework.

This article was published as a part of the Data Science Blogathon.


What’s OpenAI’s Swarm?

Swarm is a lightweight, experimental framework from OpenAI that focuses on multi-agent orchestration. It lets us coordinate multiple agents, each handling a specific task, such as summarizing content, performing sentiment analysis, or recommending actions. In our case, we’ll design three agents, with a minimal example of the underlying Agent/client pattern shown after the list below:

  • Summary Agent: Provides a concise summary of the earnings report.
  • Sentiment Agent: Analyzes the sentiment of the report.
  • Recommendation Agent: Recommends actions based on the sentiment analysis.
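
Here is a minimal, self-contained sketch of that Agent/client pattern. The greeter agent and prompt are throwaway placeholders for illustration only, not part of the final project, and the script assumes your OpenAI API key is already available in the environment.

from swarm import Swarm, Agent

# A throwaway agent used only to illustrate the pattern
greeter = Agent(
    name="Greeter",
    instructions="Reply with a one-line greeting.",
)

client = Swarm()
response = client.run(
    agent=greeter,
    messages=[{"role": "user", "content": "Hello!"}],
)
print(response.messages[-1]["content"])

Every agent in this article follows the same shape; only the instructions and the Python functions attached to each agent change.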

Use Cases and Benefits of Multi-Agent Systems

You can extend the multi-agent system built here to a variety of use cases:

  • Portfolio Management: Automate the monitoring of multiple company reports and suggest portfolio changes based on sentiment trends.
  • News Summarization for Finance: Integrate real-time news feeds with these agents to detect potential market moves early.
  • Sentiment Monitoring: Use sentiment analysis to predict stock movements or crypto trends based on positive or negative market news.

By splitting tasks into modular agents, you can reuse individual components across different projects, allowing for flexibility and scalability.

Step 1: Setting Up Your Project Environment

Before we dive into coding, it’s essential to lay a solid foundation for the project. In this step, you’ll create the necessary folders and files and install the required dependencies to get everything running smoothly.

mkdir earnings_report
cd earnings_report
mkdir agents utils
touch main.py agents/__init__.py utils/__init__.py .gitignore
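
After the later steps add the remaining files (.env, the agent modules, utils/helpers.py, and sample_earnings.txt), the project layout ends up roughly like this:

earnings_report/
├── main.py
├── .env
├── .gitignore
├── sample_earnings.txt
├── agents/
│   ├── __init__.py
│   ├── summary_agent.py
│   ├── sentiment_agent.py
│   └── recommendation_agent.py
└── utils/
    ├── __init__.py
    └── helpers.py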

Install Dependencies

pip install git+https://github.com/openai/swarm.git openai python-dotenv

Step 2: Store Your API Key Securely

Security is key, especially when working with sensitive data like API keys. This step shows you how to store your OpenAI API key securely using a .env file, ensuring your credentials stay protected. Create a .env file in the project root and add the following line:

OPENAI_API_KEY=your-openai-api-key-here

This ensures your API key is not exposed in your code. Adding .env to the .gitignore file created in Step 1 also keeps the key out of version control.
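
As a quick sanity check, you can confirm that the key loads before wiring up any agents. The snippet below is a throwaway script (not part of the final project) and assumes python-dotenv is installed:

# check_env.py - temporary script to verify the .env file is picked up
import os
from dotenv import load_dotenv

load_dotenv()  # reads .env from the current working directory
print("Key loaded:", bool(os.getenv("OPENAI_API_KEY")))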

Step 3: Implement the Agents

Now, it’s time to bring your agents to life! In this step, you’ll create three separate agents: one for summarizing the earnings report, another for sentiment analysis, and a third for generating actionable recommendations based on the sentiment.

Summary Agent

The Summary Agent extracts the first 100 characters of the earnings report as a summary.

Create agents/summary_agent.py:

from swarm import Agent

def summarize_report(context_variables):
    report_text = context_variables["report_text"]
    return f"Abstract: {report_text[:100]}..."

summary_agent = Agent(
    identify="Abstract Agent",
    directions="Summarize the important thing factors of the earnings report.",
    features=[summarize_report]
)

Sentiment Agent

This agent checks whether the word “revenue” appears in the report to determine if the sentiment is positive.

Create agents/sentiment_agent.py:

from swarm import Agent

def analyze_sentiment(context_variables):
    report_text = context_variables["report_text"]
    sentiment = "optimistic" if "revenue" in report_text else "unfavorable"
    return f"The sentiment of the report is: {sentiment}"

sentiment_agent = Agent(
    identify="Sentiment Agent",
    directions="Analyze the sentiment of the report.",
    features=[analyze_sentiment]
)

Recommendation Agent

Based on the sentiment, this agent suggests “Buy” or “Hold”.

Create agents/recommendation_agent.py:

from swarm import Agent

def generate_recommendation(context_variables):
    sentiment = context_variables["sentiment"]
    advice = "Purchase" if sentiment == "optimistic" else "Maintain"
    return f"My advice is: {advice}"

recommendation_agent = Agent(
    identify="Suggestion Agent",
    directions="Suggest actions based mostly on the sentiment evaluation.",
    features=[generate_recommendation]
)

Step 4: Add a Helper Function for File Loading

Loading data efficiently is an important part of any project. Here, you’ll create a helper function that streamlines reading the earnings report file, making it easy for your agents to access the data. Create utils/helpers.py:

def load_earnings_report(filepath):
    with open(filepath, "r") as file:
        return file.read()

Step 5: Tie Everything Together in main.py

With your agents ready, it’s time to tie everything together. In this step, you’ll write the main script that orchestrates the agents, allowing them to work in harmony to analyze the earnings report and provide insights. Create main.py:

from swarm import Swarm
from agents.summary_agent import summary_agent
from agents.sentiment_agent import sentiment_agent
from agents.recommendation_agent import recommendation_agent
from utils.helpers import load_earnings_report
import os
from dotenv import load_dotenv

# Load environment variables from the .env file
load_dotenv()

# Set the OpenAI API key from the environment variable
os.environ['OPENAI_API_KEY'] = os.getenv('OPENAI_API_KEY')

# Initialize the Swarm client
client = Swarm()

# Load earnings report
report_text = load_earnings_report("sample_earnings.txt")

# Run the summary agent
response = client.run(
    agent=summary_agent,
    messages=[{"role": "user", "content": "Summarize the report"}],
    context_variables={"report_text": report_text}
)
print(response.messages[-1]["content"])

# Run the sentiment agent on the report text
response = client.run(
    agent=sentiment_agent,
    messages=[{"role": "user", "content": "Analyze the sentiment"}],
    context_variables={"report_text": report_text}
)
print(response.messages[-1]["content"])

# Extract the sentiment and run the recommendation agent
sentiment = response.messages[-1]["content"].split(": ")[-1].strip()
response = client.run(
    agent=recommendation_agent,
    messages=[{"role": "user", "content": "Give a recommendation"}],
    context_variables={"sentiment": sentiment}
)
print(response.messages[-1]["content"])

Step 6: Create a Sample Earnings Report

To test your system, you need data! This step shows you how to create a sample earnings report for your agents to process, ensuring everything is ready for action. Create sample_earnings.txt in the project root with the following content:

Company XYZ reported a 20% increase in revenue compared to the previous quarter.
Sales grew by 15%, and the company expects continued growth in the next fiscal year.

Step 7: Run the Program

Now that everything is set up, it’s time to run the program and watch your multi-agent system in action as it analyzes the earnings report, performs sentiment analysis, and offers a recommendation.

python main.py

Expected Output:

The console prints the three agents’ responses in sequence: the report summary, the sentiment of the report, and the recommended action.

Conclusion

We’ve built a multi-agent solution using OpenAI’s Swarm framework to automate the analysis of earnings reports. With just a few agents, we can process financial information and offer actionable recommendations. You can easily extend this solution by adding new agents for deeper analysis or by integrating real-time financial APIs.

Try it yourself and see how you can enhance it with more data sources or additional agents for more advanced analysis!

Key Takeaways

  • Modular Architecture: Breaking the system into multiple agents and utilities keeps the code maintainable and scalable.
  • Swarm Framework Power: Swarm enables smooth handoffs between agents, making it easy to build complex multi-agent workflows (a minimal handoff sketch follows this list).
  • Security via .env: Managing API keys with dotenv ensures that sensitive data isn’t hardcoded into the project.
  • This project can grow to handle live financial data by integrating APIs, enabling it to provide real-time recommendations for investors.
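
The handoffs mentioned above are not used in this tutorial, where main.py calls each agent explicitly, but Swarm supports them by letting an agent function return another Agent. A minimal sketch, reusing the agents defined earlier, might look like this:

from agents.summary_agent import summary_agent
from agents.sentiment_agent import sentiment_agent

def transfer_to_sentiment_agent():
    """Hand the conversation off to the Sentiment Agent."""
    return sentiment_agent

# Returning an Agent from a function triggers a handoff in Swarm, so a
# single client.run() that starts with summary_agent can continue with
# sentiment_agent once the model decides to transfer.
summary_agent.functions.append(transfer_to_sentiment_agent)

Because context_variables are shared across a run, both agents can read report_text without any extra plumbing.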

Frequently Asked Questions

Q1. What’s OpenAI’s Swarm framework?

A. OpenAI’s Swarm is an experimental framework designed for coordinating multiple agents to perform specific tasks. It’s ideal for building modular systems where each agent has a defined role, such as summarizing content, performing sentiment analysis, or generating recommendations.

Q2. What are the key components of a multi-agent system?

A. In this tutorial, the multi-agent system consists of three key agents: the Summary Agent, the Sentiment Agent, and the Recommendation Agent. Each agent performs a specific function, such as summarizing an earnings report, analyzing its sentiment, or recommending actions based on that sentiment.

Q3. How do I secure my OpenAI API key in this project?

A. You can store your API key securely in a .env file. This way, the API key is not exposed directly in your code, maintaining security. The .env file can be loaded using the python-dotenv package.

Q4. Can I extend this project to handle live financial data?

A. Yes, the project can be extended to handle live data by integrating financial APIs. You can create additional agents to fetch real-time earnings reports and analyze trends to provide up-to-date recommendations.

Q5. Can I reuse the agents in other projects?

A. Yes, the agents are designed to be modular, so you can reuse them in other projects. You can adapt them to different tasks such as summarizing news articles, performing text sentiment analysis, or making recommendations based on any kind of structured data.

The media shown in this article is not owned by Analytics Vidhya and is used at the Author’s discretion.

