In the rapidly moving world of AI, blink, and you’ll likely miss at least one groundbreaking announcement (if not several) bringing us one step closer to AGI. Heaven forbid you actually sleep at night.
Last week 2point0 published its first news piece - our valiant effort to curate and summarise the week’s most noteworthy developments in AI. But doing so highlighted a problem: how can a single human being - even a dedicated enthusiast - possibly keep abreast of the latest AI news and trends, whilst simultaneously running their real business, feeding their family, walking their dog, and so on?
The answer, of course, was to use the very technology we’re trying to understand. By assembling an expert team of AI analysts, copywriters and editors - in this case powered by Claude 3’s Haiku and Opus models - I was able to turn something that might have taken me days to complete into something that took minutes.
This experience sparked a new project: a framework for composing autonomous, AI-powered workflows that I call Shifts.
Announcing Shifts
Shifts is an Elixir framework for composing autonomous agent workflows, using a mixture of LLM backends. Shifts is heavily inspired by CrewAI - a Python AI agent framework. I wanted a similar tool to exist for Elixir devs, so I started building one.
A Shift is a module that implements a `work/2` function, defining a workflow for a given input. There are Workers who perform Chores (analogous to CrewAI’s Agents and Tasks), and Tools can optionally be used to complete the work.
Work can be described as simple linear workflows, or as complex systems with nested branches, looping and conditionals. For a simple example, here is the (slightly truncated) version of the 2point0 news shift:
```elixir
defmodule NewsShift do
  use Shifts.Shift

  # A Shift may have many workers
  worker :analyst, role: "News Analyst",
    goal: "Create concise summaries of the latest news and trends in the AI space.",
    story: "You are an experienced analyst in the AI space...",
    llm: {Shifts.LLM.Anthropic, model: "claude-3-haiku-20240307"}

  worker :copywriter, role: "Copywriter",
    goal: "Write engaging web content for blogs in the AI space.",
    story: "You are a creative and exceptionally talented copywriter...",
    llm: {Shifts.LLM.Anthropic, model: "claude-3-sonnet-20240229"}

  # Implementing `work/2` defines the operation of the Shift
  @impl true
  def work(shift, urls) do
    shift
    |> each(:analyse, urls, &analyse_news/2)
    |> task(:draft, &draft_article/1)
  end

  defp analyse_news(shift, url) do
    task(shift, :summary, [
      task: """
      Scrape the article at the given URL, analyse the content and create a summary document.
      Your analysis should...
      URL: #{url}
      """,
      output: "A summary document comprised of the following structure...",
      tools: [ScrapeArticleTool],
      worker: :analyst
    ])
  end

  defp draft_article(%{analyse: summaries}) do
    context =
      summaries
      |> Enum.map(&String.trim/1)
      |> Enum.join("\n\n")

    Chore.new([
      task: """
      Using the given summaries, draft a news digest that covers all of the latest AI news.
      You are writing for...
      """,
      output: "A markdown formatted article. Maximum 1200 words in total.",
      context: context,
      worker: :copywriter
    ])
  end
end
```
And this is the `ScrapeArticleTool` that the analysts can use:
```elixir
defmodule ScrapeArticleTool do
  use Shifts.Tool

  description "Scrapes a URL and returns a web page as readable plain text"
  param :url, :string, "URL of the page to scrape"

  def call(_shift, %{"url" => url}) do
    %{title: title, article_text: body} = Readability.summarize(url)
    "# #{title}\n\n#{body}"
  end
end
```
I decided against using the names Agent or Task to avoid ambiguity with Elixir’s standard library modules of the same name. But I like the word “chore” - it implies a burden or drudgery - a necessary but often tedious task that we’d rather offload to someone or something else.
Imagine a world where AI handles the drudgery of data entry, report generation, and routine analysis, leaving you to channel your talents towards problem-solving, strategic thinking, and innovation. This is the promise of Shifts – getting AI to handle the mundane so we can focus on things we care about.
Mix and match LLMs
Shifts is designed to be LLM-agnostic. It currently has adapters for Anthropic’s Claude 3 models and for Hermes 2 Pro running on Ollama (OpenAI coming soon). Developers can mix and match LLMs, plugging different models into different parts of the same workflow.
For example, Claude’s Haiku offers great bang for your buck on simple intermediate tasks where you don’t necessarily care about the polish of the final output. Claude’s Opus can then be rolled out for a final “refinement” task to polish up and finish off the work.
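Here’s a rough sketch of what that tiering might look like, using the same DSL as the NewsShift example above. The module name, worker names and prompts are purely illustrative, not taken from the library’s docs:

```elixir
defmodule ReportShift do
  use Shifts.Shift

  # Haiku: cheap and fast for the rough intermediate pass
  worker :drafter, role: "Draft Writer",
    goal: "Produce a rough first-pass summary of the source material.",
    story: "You are a fast, no-frills technical writer...",
    llm: {Shifts.LLM.Anthropic, model: "claude-3-haiku-20240307"}

  # Opus: reserved for the final refinement pass
  worker :editor, role: "Editor",
    goal: "Refine rough drafts into polished, publishable prose.",
    story: "You are a meticulous editor...",
    llm: {Shifts.LLM.Anthropic, model: "claude-3-opus-20240229"}

  @impl true
  def work(shift, source_text) do
    shift
    |> task(:rough_draft, [
      task: "Summarise the following material:\n\n#{source_text}",
      output: "A rough bullet-point summary.",
      worker: :drafter
    ])
    |> task(:final, &refine_draft/1)
  end

  # Receives the accumulated results map, as in the NewsShift example
  defp refine_draft(%{rough_draft: draft}) do
    Chore.new([
      task: "Polish the given draft into a finished summary.",
      output: "A polished summary. Maximum 300 words.",
      context: draft,
      worker: :editor
    ])
  end
end
```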
When I’m building out a workflow and testing it, I can plug into Hermes 2 Pro through Ollama, saving me from paying for wasted API calls. And when I’m ready for the real deal, it’s a one-liner to swap to Claude, or soon ChatGPT.
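As a sketch, that swap is just a change to a worker’s `llm:` option inside a Shift module. Note that only the Anthropic adapter appears in the example above, so the Ollama adapter module name and model tag below are my assumptions:

```elixir
# While developing, point the worker at a local Hermes 2 Pro model via Ollama.
# (Adapter module name and model tag are placeholders.)
worker :analyst, role: "News Analyst",
  goal: "Create concise summaries of the latest news and trends in the AI space.",
  story: "You are an experienced analyst in the AI space...",
  llm: {Shifts.LLM.Ollama, model: "hermes2-pro"}

# Ready for the real deal? Only the llm option changes:
#   llm: {Shifts.LLM.Anthropic, model: "claude-3-haiku-20240307"}
```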
Show me the code!
You can check out Shifts, and my other open source Elixir+AI work, over on GitHub:
- Shifts - Autonomous AI agent workflows
- Anthropix - Unofficial Anthropic API client
- Ollama-ex - Ollama API client and library
I’m excited about the possibilities Shifts enables. But it is in very early development. For now, I’d caution against using it for anything other than kicking the tyres. I’m at that stage where I’m still figuring out the right design and approach for a few of the core mechanics, and breaking changes are certain. There’s lots to work on: metrics (token count), streaming, task delegation, while loops, conditionals, and much more besides.
But it’s already beginning to feel right. As I flesh out the code and add docs, the design will stabilise, and I’ll soon announce when Shifts is ready for more eyes and more users. Stay tuned!