10 sats \ 17 replies \ @random_ 7 Sep 2024 \ parent \ on: NWC & Twitter (X) integration - Cont'd theoretical proposal bitdevs
There's also twinny for vs code.
An extension?
This was really helpful. Thanks a lot!
I wish you nothing but success in your endeavors.
I've followed the docs and I'm still failing to get the chat to work.
Is there something missing?
Tell me what you've done so far.
Downloaded Ollama 3.9.1
Downloaded codellama:7b-instruct
Downloaded codellama:7b-code
Restarted my PC and still didn't get the 🤖 icon in my VS Code.
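If it helps narrow things down, here's the quick check I can run against Ollama's local API to confirm it's up and that both models are actually pulled. This is just a sketch assuming Ollama's default address http://localhost:11434 and Node 18+ for the built-in fetch:

```typescript
// Confirm Ollama is running locally and both codellama models are pulled.
// Assumes Ollama's default API at http://localhost:11434 (Node 18+ for global fetch).
async function checkOllamaModels(): Promise<void> {
  const res = await fetch("http://localhost:11434/api/tags");
  if (!res.ok) {
    console.error("Ollama answered with HTTP " + res.status + ". Is the server running?");
    return;
  }
  const data = (await res.json()) as { models: { name: string }[] };
  const installed = data.models.map((m) => m.name);
  for (const wanted of ["codellama:7b-instruct", "codellama:7b-code"]) {
    const status = installed.includes(wanted)
      ? "found"
      : "missing (run: ollama pull " + wanted + ")";
    console.log(wanted + ": " + status);
  }
}

checkOllamaModels().catch((err) => console.error("Could not reach Ollama:", err));
```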
I think the icon might have changed. It should be beside the extension button.
You'll have to configure twinny to use the local LLM.
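Twinny doesn't ship a model itself; it just talks to whatever endpoint you give it, so in its provider settings point it at your Ollama host/port and the model names (7b-instruct for chat, 7b-code for code completion). The exact setting names depend on your twinny version, so check its README. If the chat panel still does nothing, it's worth confirming the chat model answers over Ollama's API directly, outside the editor. A minimal sketch, assuming the default endpoint and the codellama:7b-instruct model you pulled:

```typescript
// Send one message straight to Ollama's /api/chat endpoint to confirm the chat
// model responds outside the editor. Assumes the default port 11434.
async function testChat(): Promise<void> {
  const res = await fetch("http://localhost:11434/api/chat", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({
      model: "codellama:7b-instruct",
      messages: [{ role: "user", content: "Say hello in one short sentence." }],
      stream: false, // single JSON response instead of a token stream
    }),
  });
  if (!res.ok) {
    console.error("Chat request failed with HTTP " + res.status);
    return;
  }
  const data = (await res.json()) as { message?: { content: string } };
  console.log(data.message?.content ?? "No content in response");
}

testChat().catch((err) => console.error("Could not reach Ollama:", err));
```

If that returns a sensible answer, Ollama is fine and the problem is on the twinny side (usually a wrong model name or host/port in its provider settings).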