1 sat \ 16 replies \ @EverythingSatsoshi OP 7 Sep \ parent \ on: NWC & Twitter (X) integration - Cont'd theoretical proposal bitdevs
an extension?
reply
This was really helpful, thanks a lot.
I've followed the docs and I'm still failing to get the chat to work.
Is there something missing?
reply
Tell me what you've done so far.
reply
Downloaded Ollama 3.9.1
Downloaded codellama:7b-instruct
Downloaded codellama:7b-code
Restarted my PC and still don't see the 🤖 icon in VS Code
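For anyone following along, the model pulls above can be checked from a terminal; `ollama pull` and `ollama list` are standard Ollama CLI commands, and the model tags match the ones listed above. (A full PC restart shouldn't be needed for a VS Code extension; reloading the window via `Developer: Reload Window` is usually enough.)

```shell
# Pull the two CodeLlama variants mentioned above
ollama pull codellama:7b-instruct   # chat/instruct model
ollama pull codellama:7b-code       # fill-in-the-middle (completion) model

# Confirm both models are installed locally
ollama list
```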
reply
I think the icon might have changed. It should be beside the extension button.
You'll have to configure twinny to use the local LLM.
reply
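Roughly, the twinny provider settings should point at the local Ollama server. A sketch of the relevant fields, assuming Ollama's documented defaults (field names here are illustrative, not exact settings keys; the API port 11434 is Ollama's default):

```
Provider:    ollama
Hostname:    127.0.0.1              (or 0.0.0.0 to listen on all interfaces)
Port:        11434                  (Ollama's default API port)
Chat model:  codellama:7b-instruct
FIM model:   codellama:7b-code
```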
Yeah, I read through this and used the default (Ollama). When I call it from my terminal it responds, but aside from that, nothing happens in the extension tab.
Also, from the img above, am I missing something in the config?
reply
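One way to narrow this down: if the CLI responds but the extension doesn't, check that Ollama's HTTP API is reachable, since the extension talks to that rather than the CLI. (The `/api/tags` endpoint and default port 11434 are from Ollama's API docs.)

```shell
# The extension talks to Ollama over HTTP, not the CLI.
# This should return a JSON list of your installed models:
curl http://127.0.0.1:11434/api/tags
```

If this fails, Ollama isn't serving on that host/port and the extension config needs to match wherever it actually listens.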
127.0.0.1
Please, can we wrap this up on hivetalk.org? That way I can share my screen with you.
Cool?
for all hostnames?