
Building an Earnings Report Agent with Swarm Framework


Imagine if you could automate the tedious job of analyzing earnings reports, extracting key insights, and making informed recommendations, all without lifting a finger. In this article, we'll walk you through how to create a multi-agent system using OpenAI's Swarm framework, designed to handle exactly these tasks. You'll learn how to set up and orchestrate three specialized agents: one to summarize earnings reports, another to analyze sentiment, and a third to generate actionable recommendations. By the end of this tutorial, you'll have a scalable, modular solution to streamline financial analysis, with potential applications beyond just earnings reports.

Learning Outcomes

  • Understand the fundamentals of OpenAI's Swarm framework for multi-agent systems.
  • Learn how to create agents for summarization, sentiment analysis, and recommendations.
  • Explore the use of modular agents for earnings report analysis.
  • Securely manage API keys using a .env file.
  • Implement a multi-agent system to automate earnings report processing.
  • Gain insights into real-world applications of multi-agent systems in finance.
  • Set up and execute a multi-agent workflow using OpenAI's Swarm framework.

This article was published as a part of the Data Science Blogathon.


What is OpenAI's Swarm?

Swarm is a lightweight, experimental framework from OpenAI that focuses on multi-agent orchestration. It lets us coordinate multiple agents, each handling a specific task, such as summarizing content, performing sentiment analysis, or recommending actions. In our case, we'll design three agents (a minimal sketch of Swarm's handoff mechanism follows the list below):

  • Summary Agent: Provides a concise summary of the earnings report.
  • Sentiment Agent: Analyzes the sentiment of the report.
  • Recommendation Agent: Recommends actions based on the sentiment analysis.
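
Swarm's core abstraction is an Agent with instructions and optional Python functions; when a function returns another Agent, Swarm hands the conversation off to that agent. The sketch below illustrates the handoff mechanism in isolation. This tutorial chains its agents manually from main.py instead, and the prompts here are placeholders:

from swarm import Swarm, Agent

# Target agent for the handoff
sentiment_agent = Agent(
    name="Sentiment Agent",
    instructions="Analyze the sentiment of the text you are given.",
)

def transfer_to_sentiment_agent():
    # Returning an Agent from a function triggers a handoff in Swarm
    return sentiment_agent

summary_agent = Agent(
    name="Summary Agent",
    instructions="Summarize the text, then hand off to the Sentiment Agent.",
    functions=[transfer_to_sentiment_agent],
)

client = Swarm()
response = client.run(
    agent=summary_agent,
    messages=[{"role": "user", "content": "Company XYZ reported higher profits this quarter."}],
)
print(response.messages[-1]["content"])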

Use Cases and Benefits of Multi-Agent Systems

You can extend the multi-agent system built here to a variety of use cases:

  • Portfolio Management: Automate the monitoring of multiple company reports and suggest portfolio changes based on sentiment trends.
  • News Summarization for Finance: Integrate real-time news feeds with these agents to detect potential market movements early.
  • Sentiment Monitoring: Use sentiment analysis to predict stock movements or crypto trends based on positive or negative market news.

By splitting tasks into modular agents, you can reuse individual components across different projects, allowing for flexibility and scalability.

Step 1: Setting Up Your Project Environment

Before we dive into coding, it's important to lay a solid foundation for the project. In this step, you'll create the necessary folders and files and install the required dependencies to get everything running smoothly.

mkdir earnings_report
cd earnings_report
mkdir agents utils
touch main.py agents/__init__.py utils/__init__.py .gitignore
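
Once the remaining steps are in place (the .env file from Step 2, the agent modules from Step 3, the helper from Step 4, and the sample report from Step 6), the project should look roughly like this:

earnings_report/
├── main.py
├── .env
├── .gitignore
├── sample_earnings.txt
├── agents/
│   ├── __init__.py
│   ├── summary_agent.py
│   ├── sentiment_agent.py
│   └── recommendation_agent.py
└── utils/
    ├── __init__.py
    └── helpers.py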

Install Dependencies

pip install git+https://github.com/openai/swarm.git openai python-dotenv

Step 2: Store Your API Key Securely

Security is critical, especially when working with sensitive data like API keys. This step shows you how to store your OpenAI API key securely using a .env file, keeping your credentials safe. Create a .env file in the project root and add your key:

OPENAI_API_KEY=your-openai-api-key-here

This ensures your API key is not exposed in your code.
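
It is also worth adding .env to the .gitignore file created in Step 1 so the key never ends up in version control, for example:

# .gitignore
.env
__pycache__/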

Step 3: Implement the Agents

Now it's time to bring your agents to life! In this step, you'll create three separate agents: one to summarize the earnings report, another to analyze its sentiment, and a third to generate actionable recommendations based on that sentiment.

Summary Agent

The Summary Agent will extract the first 100 characters of the earnings report as a summary.

Create agents/summary_agent.py:

from swarm import Agent

def summarize_report(context_variables):
    # Swarm injects the context_variables passed to client.run()
    report_text = context_variables["report_text"]
    return f"Summary: {report_text[:100]}..."

summary_agent = Agent(
    name="Summary Agent",
    instructions="Summarize the key points of the earnings report.",
    functions=[summarize_report]
)
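
Swarm passes the context_variables you supply to client.run() into any function that declares a context_variables parameter, so the function itself works with a plain dictionary. You can sanity-check it outside Swarm with a quick throwaway snippet (not one of the tutorial files):

# Quick local check of the function, independent of Swarm
print(summarize_report({"report_text": "Company XYZ reported a 20% increase in profits this quarter."}))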

Sentiment Agent

This agent checks whether the word "profit" appears in the report to determine whether the sentiment is positive.

Create agents/sentiment_agent.py:

from swarm import Agent

def analyze_sentiment(context_variables):
    report_text = context_variables["report_text"]
    # Naive keyword check: treat any mention of "profit" as a positive signal
    sentiment = "positive" if "profit" in report_text else "negative"
    return f"The sentiment of the report is: {sentiment}"

sentiment_agent = Agent(
    name="Sentiment Agent",
    instructions="Analyze the sentiment of the report.",
    functions=[analyze_sentiment]
)
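
The keyword check above keeps the example deterministic and cheap. If you want the model itself to judge the sentiment, one possible variant is to call the OpenAI chat API inside the function. This is a sketch, not part of the original tutorial, and the model name is an assumption:

from openai import OpenAI

def analyze_sentiment_llm(context_variables):
    report_text = context_variables["report_text"]
    client = OpenAI()  # reads OPENAI_API_KEY from the environment
    completion = client.chat.completions.create(
        model="gpt-4o-mini",  # assumed model; use any chat model you have access to
        messages=[
            {"role": "system", "content": "Reply with exactly one word: positive or negative."},
            {"role": "user", "content": report_text},
        ],
    )
    sentiment = completion.choices[0].message.content.strip().lower()
    return f"The sentiment of the report is: {sentiment}"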

Recommendation Agent

Based on the sentiment, this agent will suggest "Buy" or "Hold".

Create agents/recommendation_agent.py:

from swarm import Agent

def generate_recommendation(context_variables):
    sentiment = context_variables["sentiment"]
    # Simple rule: buy on positive sentiment, otherwise hold
    recommendation = "Buy" if sentiment == "positive" else "Hold"
    return f"My recommendation is: {recommendation}"

recommendation_agent = Agent(
    name="Recommendation Agent",
    instructions="Recommend actions based on the sentiment analysis.",
    functions=[generate_recommendation]
)

Step 4: Add a Helper Function for File Loading

Loading data efficiently is an important part of any project. Here, you'll create a helper function to streamline reading the earnings report file, making it easier for your agents to access the data. Create utils/helpers.py:

def load_earnings_report(filepath):
    # Read the entire report into a single string
    with open(filepath, "r") as file:
        return file.read()

Step 5: Tie Everything Together in main.py

With your agents ready, it's time to tie everything together. In this step, you'll write the main script that orchestrates the agents, allowing them to work in concert to analyze the earnings report and surface insights.

from swarm import Swarm
from agents.summary_agent import summary_agent
from agents.sentiment_agent import sentiment_agent
from agents.recommendation_agent import recommendation_agent
from utils.helpers import load_earnings_report
import os
from dotenv import load_dotenv

# Load environment variables from the .env file
load_dotenv()

# Set the OpenAI API key from the environment variable
os.environ['OPENAI_API_KEY'] = os.getenv('OPENAI_API_KEY')

# Initialize the Swarm client
client = Swarm()

# Load the earnings report
report_text = load_earnings_report("sample_earnings.txt")

# Run the summary agent
response = client.run(
    agent=summary_agent,
    messages=[{"role": "user", "content": "Summarize the report"}],
    context_variables={"report_text": report_text}
)
print(response.messages[-1]["content"])

# Pass the report to the sentiment agent
response = client.run(
    agent=sentiment_agent,
    messages=[{"role": "user", "content": "Analyze the sentiment"}],
    context_variables={"report_text": report_text}
)
print(response.messages[-1]["content"])

# Extract the sentiment and run the recommendation agent
sentiment = response.messages[-1]["content"].split(": ")[-1].strip()
response = client.run(
    agent=recommendation_agent,
    messages=[{"role": "user", "content": "Give a recommendation"}],
    context_variables={"sentiment": sentiment}
)
print(response.messages[-1]["content"])

Step 6: Create a Sample Earnings Report

To test your system, you need data! This step shows you how to create a sample earnings report for your agents to process, making sure everything is ready for action. Create sample_earnings.txt in the project root:

Company XYZ reported a 20% increase in profits compared to the previous quarter.
Sales grew by 15%, and the company expects continued growth in the next fiscal year.

Step 7: Run the Program

Now that everything is set up, it's time to run the program and watch your multi-agent system in action as it analyzes the earnings report, performs sentiment analysis, and offers recommendations.

python main.py

Expected Output:

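With the sample report above, the three print statements should produce something along these lines. The exact wording can vary from run to run, since the model composes each agent's final reply:

Summary: Company XYZ reported a 20% increase in profits compared to the previous quarter. Sales grew by 15%, ...
The sentiment of the report is: positive
My recommendation is: Buy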

Conclusion

We've built a multi-agent solution using OpenAI's Swarm framework to automate the analysis of earnings reports. With just a few agents, we can process financial information and produce actionable recommendations. You can easily extend this solution by adding new agents for deeper analysis or by integrating real-time financial APIs.

Try it yourself and see how you can enhance it with additional data sources or agents for more advanced analysis!

Key Takeaways

  • Modular Architecture: Breaking the system into multiple agents and utilities keeps the code maintainable and scalable.
  • Swarm Framework Power: Swarm enables smooth handoffs between agents, making it easy to build complex multi-agent workflows.
  • Security via .env: Managing API keys with dotenv ensures that sensitive data isn't hardcoded into the project.
  • The project can grow to handle live financial data by integrating APIs, enabling it to provide real-time recommendations for investors.

Frequently Asked Questions

Q1. What is OpenAI's Swarm framework?

A. OpenAI's Swarm is an experimental framework designed for coordinating multiple agents to perform specific tasks. It is ideal for building modular systems where each agent has a defined role, such as summarizing content, performing sentiment analysis, or generating recommendations.

Q2. What are the key components of a multi-agent system?

A. In this tutorial, the multi-agent system consists of three key agents: the Summary Agent, Sentiment Agent, and Recommendation Agent. Each agent performs a specific function, such as summarizing an earnings report, analyzing its sentiment, or recommending actions based on that sentiment.

Q3. How do I secure my OpenAI API key in this project?

A. You can store your API key securely in a .env file. This way, the API key is not exposed directly in your code, maintaining security. The .env file can be loaded using the python-dotenv package.

Q4. Can I expand this project to handle live financial data?

A. Yes, the project can be extended to handle live data by integrating financial APIs. You can create additional agents to fetch real-time earnings reports and analyze trends to provide up-to-date recommendations.

Q5. Can I reuse the agents in other projects?

A. Yes, the agents are designed to be modular, so you can reuse them in other projects. You can adapt them to different tasks such as summarizing news articles, performing text sentiment analysis, or making recommendations based on any kind of structured data.

The media shown in this article is not owned by Analytics Vidhya and is used at the Author's discretion.

