# Examples

## Neuron Planner / Executor
This example uses Queuety for orchestration and Neuron for the actual agent execution.

The pattern is:

- a Queuety planner step decides what work exists
- `spawn_agents()` turns those tasks into independent top-level workflows
- each child workflow runs a Neuron agent
- `await_agents()` joins the results back into the parent workflow
### Why this split works well
Neuron is good at:
- switching providers with minimal code changes
- encapsulating the prompt, tools, and memory for a specialist agent
Queuety is good at:
- durable fan-out and joining
- retries and resumability
- workflow inspection and human gates
### Step 1: create a provider-switchable Neuron agent
```php
namespace App\Neuron;

use NeuronAI\Agent;
use NeuronAI\Providers\AIProviderInterface;
use NeuronAI\Providers\Anthropic\Anthropic;
use NeuronAI\Providers\Gemini\Gemini;
use NeuronAI\Providers\OpenAI\Responses\OpenAIResponses;
use NeuronAI\SystemPrompt;

final class ResearchAgent extends Agent
{
    public function __construct(
        private readonly string $providerName = 'openai',
    ) {}

    protected function provider(): AIProviderInterface
    {
        return match ($this->providerName) {
            'anthropic' => new Anthropic(
                key: $_ENV['ANTHROPIC_API_KEY'],
                model: $_ENV['ANTHROPIC_MODEL'],
            ),
            'gemini' => new Gemini(
                key: $_ENV['GEMINI_API_KEY'],
                model: $_ENV['GEMINI_MODEL'],
            ),
            default => new OpenAIResponses(
                key: $_ENV['OPENAI_API_KEY'],
                model: $_ENV['OPENAI_MODEL'],
            ),
        };
    }

    public function instructions(): string
    {
        return (string) new SystemPrompt(
            background: [
                'You are a research agent.',
                'Return concise, source-aware summaries for the assigned topic.',
            ],
        );
    }
}
```

The useful part here is that the workflow can choose the provider at runtime by passing `provider` in the workflow state.
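For instance, a caller can pick a provider per run without touching the agent class. A minimal sketch (the topic string is illustrative, valid API keys are assumed in the environment, and `chat()` is assumed to return the assistant message directly, as in Neuron's basic usage):

```php
use App\Neuron\ResearchAgent;
use NeuronAI\Chat\Messages\UserMessage;

// Select the provider per run; everything else about the agent stays the same.
$agent = new ResearchAgent(providerName: 'gemini');

// One chat turn; the agent's instructions() supply the system prompt.
$response = $agent->chat(new UserMessage('Summarize recent pricing changes.'));

echo $response->getContent();
```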
### Step 2: run the Neuron agent inside a Queuety step
```php
namespace App\Workflow\Steps;

use App\Neuron\ResearchAgent;
use NeuronAI\Chat\Messages\UserMessage;
use Queuety\Step;

final class ResearchTopicStep implements Step
{
    public function handle(array $state): array
    {
        $agent = new ResearchAgent(
            providerName: $state['provider'] ?? 'openai',
        );

        $response = $agent->chat(
            new UserMessage(sprintf(
                "Research the topic '%s' for brief %d. Return a compact bullet summary.",
                $state['topic'],
                $state['brief_id'],
            ))
        );

        return [
            'summary' => $response->getContent(),
        ];
    }

    public function config(): array
    {
        return [];
    }
}
```

This is the key integration point: Neuron handles the LLM call, but Queuety still owns the step boundary and the persisted workflow state.
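Because the integration point is just a step's `handle()` method, the same class can also be exercised directly, outside any workflow run, e.g. to smoke-test a provider. A sketch (assumes valid API keys in the environment; the input values are illustrative):

```php
use App\Workflow\Steps\ResearchTopicStep;

// Call the step directly with a hand-built state array (no workflow involved).
$result = (new ResearchTopicStep())->handle([
    'topic' => 'pricing',
    'brief_id' => 42,
    'provider' => 'openai',
]);

// The step's return value is exactly what Queuety would merge into workflow state.
echo $result['summary'];
```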
### Step 3: let the planner spawn agent runs
```php
use Queuety\Enums\WaitMode;
use Queuety\Queuety;

$agent_workflow = Queuety::workflow('research_agent_run')
    ->then(ResearchTopicStep::class);

Queuety::workflow('brief_research')
    ->version('brief-research.v1')
    ->then(PlanResearchTasksStep::class) // writes $state['agent_tasks']
    ->spawn_agents('agent_tasks', $agent_workflow)
    ->await_agents(mode: WaitMode::All, result_key: 'agent_results')
    ->then(SynthesizeBriefStep::class)
    ->dispatch([
        'brief_id' => 42,
        'provider' => 'anthropic',
    ]);
```

If `PlanResearchTasksStep` returns:
```php
[
    'agent_tasks' => [
        ['topic' => 'pricing'],
        ['topic' => 'customer reviews'],
        ['topic' => 'recent launches'],
    ],
]
```

Queuety will create one top-level workflow per task, wait until they are all complete, and then pass the finished child workflow states under `$state['agent_results']`.
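`SynthesizeBriefStep` can then read those child states. A minimal sketch, assuming each entry in `agent_results` carries the `summary` key written by `ResearchTopicStep` in Step 2 (the exact shape of the joined results is an assumption here):

```php
namespace App\Workflow\Steps;

use Queuety\Step;

final class SynthesizeBriefStep implements Step
{
    public function handle(array $state): array
    {
        // Collect the per-topic summaries written by each ResearchTopicStep run.
        $summaries = array_map(
            fn (array $child): string => $child['summary'] ?? '',
            $state['agent_results'] ?? [],
        );

        return [
            'brief' => implode("\n\n", $summaries),
        ];
    }

    public function config(): array
    {
        return [];
    }
}
```

A real synthesis step might hand the collected summaries to another Neuron agent instead of concatenating them; the point is only that the join key makes the children's output available in one place.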
### Why not just call Neuron in a loop?
You could do that, but you would lose the orchestration advantages:
- each agent run would not be individually inspectable
- a partial crash would likely restart the whole batch
- there would be no clean join point for later stages
This pattern keeps the agent code simple while letting Queuety own the durable orchestration.
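For contrast, the in-process version collapses everything into a single step body, so a crash at topic three loses the first two results. A sketch of that shape (not a recommendation; `$state` is assumed to hold the same keys as in the steps above):

```php
use App\Neuron\ResearchAgent;
use NeuronAI\Chat\Messages\UserMessage;

// Everything in one step: no per-topic retry or inspection,
// no join point, and a crash mid-loop restarts the whole batch.
$summaries = [];
foreach ($state['agent_tasks'] as $task) {
    $agent = new ResearchAgent(providerName: $state['provider'] ?? 'openai');
    $response = $agent->chat(new UserMessage(
        sprintf("Research the topic '%s'.", $task['topic'])
    ));
    $summaries[] = $response->getContent();
}
```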