
Downloaded Ollama 3.9.1, codellama:7b-instruct, and codellama:7b-code.
Restarted my PC and still didn't get the 🤖 icon in my VS Code.
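Before blaming the extension, it's worth a sanity check that the models actually landed. Assuming a standard Ollama install, the CLI can confirm both:

```shell
# List locally installed models; both codellama tags should show up here.
ollama list

# Run one interactively to confirm the model itself responds.
ollama run codellama:7b-instruct "write hello world in python"
```

If `ollama list` comes back empty, the problem is the download, not VS Code.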
I think the icon might have changed. It should be beside the Extensions button.
You'll have to configure twinny to use the local LLM.
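For reference, twinny talks to Ollama over its local HTTP API. The values below are Ollama's defaults; the exact field names in twinny's provider form may differ by version, so treat this as a sketch rather than the definitive settings:

```
Provider:    ollama
Hostname:    127.0.0.1              (or localhost)
Port:        11434                  (Ollama's default)
Chat model:  codellama:7b-instruct
FIM model:   codellama:7b-code
API key:     typically not needed for a local Ollama server
```

The model names must match what `ollama list` reports exactly, tag included.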
reply
Yeah, I read through this and used the default (Ollama). When I call it from my terminal it responds, but aside from that, nothing happens in the extension tab.
Also, from the image above, am I missing something in the config?
reply
Try 127.0.0.1 instead
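A quick way to confirm the extension can actually reach Ollama on the loopback address (11434 is Ollama's default port, and both endpoints below are part of Ollama's standard HTTP API):

```shell
# Should return a JSON list of installed models if the server is up.
curl http://127.0.0.1:11434/api/tags

# A one-off generation against the same API the extension would use.
curl http://127.0.0.1:11434/api/generate \
  -d '{"model": "codellama:7b-instruct", "prompt": "hello", "stream": false}'
```

If these work but the extension still shows nothing, the problem is in the twinny config, not in Ollama.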
reply
127.0.0.1: please, can we conclude this on hivetalk.org?
That way I can share my screen with you.
Cool?
reply
Did you manage to figure it out? I took a look at the pic, and I think you are missing an API key from Ollama.
reply
Where am I supposed to get that from?
reply
Driving today, but maybe Monday.
reply
For all hostnames?
reply
Yes
reply
Still nothing.
reply