TL;DR: You don't need to pay OpenAI to log all your ideas and sell them to your competitors... you can just run a local research assistant without any middlemen.
Everyone and their mother-in-law seems to be super amazed by ChatGPT Deep Research.

To prove that this isn't magic, I pulled jan-nano - a model fine-tuned for MCP tool calling on top of Qwen3 - with ollama, and connected it to the ddg-search MCP server (because that doesn't KYC). I then wired both together in fast-agent:
`fastagent.config.yaml`:

```yaml
mcp:
  servers:
    search:
      command: "npx"
      args: ["-y", "@oevortex/ddg_search@latest"]
```

`agent.py`:
```python
import asyncio

from mcp_agent.core.fastagent import FastAgent

# Create the application
fast = FastAgent("fast-agent")


@fast.agent(
    name="researcher",
    servers=["search"],
    use_history=False,
    model="generic.jan-nano:latest",
)
async def main():
    async with fast.run() as agent:
        await agent.interactive()


if __name__ == "__main__":
    asyncio.run(main())
```

I then took the last problem a consultant friend told me he was going to have a four-hour exploratory meeting about: how to run a Docker container orchestrator for a lightweight deployment (without k8s).
╭──────────────────────────────────────────────────────── (researcher) [USER] ─╮
│ │
│ Research what the best orchestrator for docker containers is in │
│ lightweight but diverse deployments spanning at most 3 host servers. │
│ │
╰─ jan-nano:latest turn 1 ─────────────────────────────────────────────────────╯

The agent does the <think></think> thing and then:
╭─ [TOOL CALL] ────────────────────────────────────────────────────────────────╮
│ │
│ {"numResults":5,"query":"best Docker orchestrator for lightweight │
│ deployments across 3 host servers"} │
│ │
╰─ [web] [fetch] [url] [felo] ────────────────────────────────────────────────╯
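As an aside on what's in that box: under the MCP hood, fast-agent spawns the ddg_search process and sends the call as a JSON-RPC 2.0 `tools/call` request over the server's stdin. A rough standard-library-only sketch of the message - the tool name `web-search` is my placeholder here, the real name comes from the server's `tools/list` response:

```python
import json

# Sketch of the JSON-RPC 2.0 request fast-agent writes to the MCP
# server's stdin for the tool call shown above. "web-search" is an
# illustrative placeholder for the actual tool name.
request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "tools/call",
    "params": {
        "name": "web-search",
        "arguments": {
            "query": "best Docker orchestrator for lightweight "
                     "deployments across 3 host servers",
            "numResults": 5,
        },
    },
}

# On the stdio transport, messages go over as serialized JSON.
wire = json.dumps(request)
print(wire)
```

The server answers the same way on stdout, which is where the `TextContent` result in the next box comes from.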
╭────────────────────────────────────────────────────────────── [TOOL RESULT] ─╮
│ │
│ meta=None content=[TextContent(type='text', text='1. [16 Best Container │
│ Orchestration Tools and Services - │
│ DevOpsCube](https://devopscube.com/docker-container-clustering-tools/)\n │
│ In this 2025 updated guide, I have covered the best 16 container │
│ │
╰──────────────────────────────────────────────────────────────────────────────╯

And then it spits out a massive response block with a recommendation:
╭─ [ASSISTANT] (researcher) ───────────────────────────────────────────────────╮
│ [..] │
│ **Recommendation**: Use Docker Swarm for its simplicity, integration, and │
│ efficiency in small-scale, diverse deployments. │
│ │
╰─ [search] ──────────────────────────────────────────────────────────────────╯

I still think it doesn't do enough research, so I'm going to tune this a bit, but... sovereign "deep research" for just the electricity cost on your old MacBook is here. It took about 2 minutes on my 2020 M1.
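For that tuning, one cheap direction (purely a hypothetical sketch - none of these names are fast-agent API) is to fan the question out into a handful of queries before the model summarizes, so it has to search more than once:

```python
# Hypothetical helper: expand one research question into several search
# queries, to force more than a single tool call per run.
def expand_queries(question: str) -> list[str]:
    # Naive expansion: pair the original question with a few angles a
    # human researcher would also check.
    angles = ["comparison", "resource usage", "production experience"]
    return [question] + [f"{question} {angle}" for angle in angles]


queries = expand_queries("lightweight docker orchestrator for 3 hosts")
print(len(queries))  # 4 queries instead of the single one above
```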
See, I want to be able to do this, but if I've learned anything from Bitcoin, it's that setting up things like this takes me way longer than you might expect. I know I'll be happy if I just suck it up and figure it out, but... I just need a stronger nudge.
Let me test it with `goose` - should work.

Took me about 5 minutes to configure it in `goose`.

I did not know about goose. But then again, I did not know about jan. I've bookmarked your post here and my challenge for the weekend will be to see if I can set it up myself. I'll report back.
Goose is Block Inc's open source AI client: https://github.com/block/goose
Definitely a tangent, but I have a friend at a medium-biggish tech co, and I was surprised to hear they use Goose when they aren't using Claude.
It's hard to know when things that start near/inside the bitcoin bubble get out of it.
Yes! I like goose as a client, because it's extremely versatile for prompt-only things. I also tried using it for vibe coding, but I don't really like vibe coding (probably because I mainly write `c++` on the daily, and the code it spits out is awful squared), so I generally don't touch it, in favor of fast-agent, because that's programmable.

Unfortunately for all the LLM people, the most exposure I get to prompt-only is when Brave Search interprets my query and shows me an LLM result on top. Which is, by the way, a genius move to try to stay relevant - by all the search engine corps, as they all do it - because it gives better answers than the top 10 search results, even on a no-ads engine.