Introduction
Whether you are refactoring legacy code, implementing new features, or debugging complex issues, AI coding assistants can speed up your development workflow and reduce time-to-delivery. OpenHands is an AI-powered coding framework that acts like a real development partner: it understands complex requirements, navigates entire codebases, writes and modifies code across multiple files, debugs errors, and can even interact with external services. Unlike traditional code completion tools that suggest snippets, OpenHands acts as an autonomous agent capable of carrying out full development tasks from start to finish.
On the model side, GPT-OSS is OpenAI's family of open-weight large language models built for advanced reasoning and code generation. These models, released under the Apache 2.0 license, bring capabilities that were previously locked behind proprietary APIs into a fully accessible form. GPT-OSS-20B offers fast responses and modest resource requirements, making it well suited for smaller teams or individual developers running models locally.
GPT-OSS-120B delivers deeper reasoning for complex workflows, large-scale refactoring, and architectural decision-making, and it can be deployed on more powerful hardware for higher throughput. Both models use a mixture-of-experts architecture, activating only the parts of the network needed for a given request, which helps balance efficiency with performance.
This tutorial will guide you through creating a complete local AI coding setup that combines OpenHands' agent capabilities with GPT-OSS models.
Tutorial: Building Your Local AI Coding Agent
Prerequisites
Before we begin, ensure you have the following in place:
Get a PAT key: To use OpenHands with Clarifai models, you'll need a Personal Access Token (PAT). Log in or sign up for a Clarifai account, then navigate to your Security settings to generate a new PAT.
Get a model: Clarifai's Community offers a wide selection of cutting-edge language models that you can run with OpenHands. Browse the community to find the model that best fits your use case. For this example, we'll use the gpt-oss-120b model.
Install Docker Desktop: OpenHands runs inside a Docker container, so you'll need Docker installed and running on your system. Download and install Docker Desktop for your operating system from the official Docker website, following the installation steps specific to your OS (Windows, macOS, or Linux).
Step 1: Pull the Runtime Image
OpenHands uses a dedicated Docker image to provide a sandboxed execution environment. You can pull this image from the all-hands-ai Docker registry.
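For example, following the naming pattern used by the OpenHands docs (the version tag below is illustrative; check the docs for the tag matching your OpenHands release):

```shell
# Pull the sandbox runtime image (replace 0.47 with the current release tag)
docker pull docker.all-hands.dev/all-hands-ai/runtime:0.47-nikolaik
```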
Step 2: Run OpenHands
Start OpenHands using the following comprehensive docker run command.
This command launches a new Docker container running OpenHands with all the necessary configuration: environment variables for logging, Docker engine access for sandboxing, port mapping so the web interface is reachable on localhost:3000, persistent data storage in the ~/.openhands folder, host communication, and automatic cleanup when the container exits.
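A sketch of that command, following the pattern in the OpenHands documentation (the 0.47 version tags are illustrative; substitute the current release, and make sure the runtime tag matches the image pulled in Step 1):

```shell
# Launch OpenHands: logging enabled, Docker socket mounted for sandboxing,
# web UI on port 3000, state persisted under ~/.openhands, auto-removed on exit.
docker run -it --rm --pull=always \
  -e SANDBOX_RUNTIME_CONTAINER_IMAGE=docker.all-hands.dev/all-hands-ai/runtime:0.47-nikolaik \
  -e LOG_ALL_EVENTS=true \
  -v /var/run/docker.sock:/var/run/docker.sock \
  -v ~/.openhands:/.openhands \
  -p 3000:3000 \
  --add-host host.docker.internal:host-gateway \
  --name openhands-app \
  docker.all-hands.dev/all-hands-ai/openhands:0.47
```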
Step 3: Access the Web Interface
After running the docker run command, monitor the terminal for log output. Once the application finishes starting up, open your preferred web browser and navigate to http://localhost:3000.
At this point, OpenHands is successfully installed and running on your local machine, ready for configuration.
Step 4: Configure OpenHands with GPT-OSS
To configure OpenHands, open its interface and click the Settings (gear icon) in the bottom-left corner of the sidebar.
The Settings page lets you connect OpenHands to an LLM, which serves as its cognitive engine, and integrate it with GitHub for version control and collaboration.
Connect to GPT-OSS via Clarifai
On the Settings page, go to the LLM tab and toggle the Advanced button.
Fill in the following fields for the model integration:
Custom Model: Enter the Clarifai model URL for GPT-OSS-120B. To ensure OpenAI compatibility, prefix the model path with openai/, followed by the full Clarifai model URL: openai/https://clarifai.com/openai/chat-completion/models/gpt-oss-120b
Base URL: Enter Clarifai's OpenAI-compatible API endpoint: https://api.clarifai.com/v2/ext/openai/v1
API Key: Enter your Clarifai PAT.
After filling in the fields, click the Save Changes button in the bottom-right corner of the interface.
While this tutorial focuses on the GPT-OSS-120B model, Clarifai's Community has over 100 open-source and third-party models that you can access through the same OpenAI-compatible API. Simply replace the model URL in the Custom Model field with any other model from Clarifai's catalog to experiment with different capabilities and find the one that best fits your development workflow.
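Before relying on the integration, you can sanity-check your PAT and the endpoint with a direct request. The sketch below assumes CLARIFAI_PAT is an environment variable you have set to your token; note that the openai/ prefix is only needed inside OpenHands, so the raw API call uses the bare model URL:

```shell
# Verify the OpenAI-compatible endpoint accepts your PAT and model URL.
curl -s https://api.clarifai.com/v2/ext/openai/v1/chat/completions \
  -H "Authorization: Bearer $CLARIFAI_PAT" \
  -H "Content-Type: application/json" \
  -d '{
        "model": "https://clarifai.com/openai/chat-completion/models/gpt-oss-120b",
        "messages": [{"role": "user", "content": "Hello"}]
      }'
```

A JSON response with a choices array indicates the credentials and model URL are valid; an error body here means OpenHands will fail with the same settings.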
Step 5: Integrate with GitHub
Within the same Settings page, navigate to the Integrations tab.
Enter your GitHub token in the provided field, then click Save Changes in the bottom-right corner of the interface to apply the integration.
Step 6: Start Building with AI-Powered Development
Next, click the plus (+) Start new conversation button at the top of the sidebar. From there, connect to a repository by selecting your desired repo and its branch.
Once selected, click the Launch button to begin your coding session with full repository access.
In the main interface, use the input field to prompt the agent and start generating code. The GPT-OSS-120B model will understand your requirements and provide intelligent, context-aware assistance tailored to your connected repository.
Example prompts to get started:
- Documentation: "Generate a comprehensive README.md file for this repository that explains the project purpose, installation steps, and usage examples."
- Testing: "Write detailed unit tests for the user authentication functions in the auth.py file, including edge cases and error handling scenarios."
- Code Enhancement: "Analyze the database connection logic and refactor it to use connection pooling for better performance and reliability."
OpenHands forwards your request to the configured GPT-OSS-120B model, which responds with code suggestions, explanations, and implementations that reflect your project context. Once you're satisfied, you can push your code to GitHub directly from the interface, maintaining full version control integration.
Conclusion
You've set up a fully functional AI coding agent that runs on your local machine using OpenHands with the GPT-OSS-120B model.
If you want the model itself to run locally as well, you can set it up with local runners. For example, you can run the GPT-OSS-20B model locally, expose it as a public API, and use that URL to power your coding agent. Check out the tutorial on running gpt-oss models locally with local runners here.
If you need more computing power, you can deploy gpt-oss models on your own dedicated machines using compute orchestration and then integrate them with your coding agents, giving you greater control over performance and resource allocation.