0 sats \ 18 replies \ @k00b 7 Sep \ parent \ on: NWC & Twitter (X) integration - Cont'd theoretical proposal bitdevs
I don't use much AI outside of GitHub Copilot, but I've been wanting to try Cursor. We have some people at Pleb Lab who have managed to do a lot with it.
There's also twinny for VS Code.
reply
An extension?
reply
This was really helpful
Thanks a lot
I've followed the docs and I'm still failing to get the chat to work.
Is there something missing?
reply
Tell me what you've done so far.
reply
Downloaded Ollama 3.9.1
Downloaded codellama:7b-instruct
Downloaded codellama:7b-code
Restarted my PC and still didn't get the 🤖 icon in my VS Code.
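For reference, a quick way to confirm the Ollama side of that setup is actually working (a minimal sketch, assuming Ollama's default local port 11434 and its standard /api/tags endpoint) is to list the installed models over HTTP, since that's the same interface the extension talks to:

```python
import json
import urllib.request

# Ollama's local HTTP API (default port 11434); /api/tags lists installed models.
OLLAMA_URL = "http://localhost:11434/api/tags"

# The two models pulled in the steps above.
EXPECTED = {"codellama:7b-instruct", "codellama:7b-code"}

try:
    with urllib.request.urlopen(OLLAMA_URL, timeout=5) as resp:
        tags = json.load(resp)
except OSError as err:
    raise SystemExit(f"Ollama server not reachable at {OLLAMA_URL}: {err}")

installed = {m["name"] for m in tags.get("models", [])}
print("Installed models:", ", ".join(sorted(installed)) or "(none)")

missing = EXPECTED - installed
if missing:
    print("Missing models (pull with `ollama pull <name>`):", ", ".join(sorted(missing)))
else:
    print("Both codellama models are installed; the server side looks fine.")
```

If that request fails, the extension has nothing to connect to, whatever the icon looks like.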
reply
I think the icon might have changed. It should be beside the extensions button.
You'll have to configure twinny to use the local LLM.
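Configuring it really just means pointing the extension at the same endpoint and model name you'd call by hand. Here's a minimal sketch of that request (assuming the default Ollama port 11434 and its standard /api/generate endpoint; the exact twinny setting names live in the extension's own provider configuration, so nothing below is specific to twinny itself):

```python
import json
import urllib.request

# The same kind of call a local-LLM extension makes: Ollama's generate API on
# the default port 11434. The model name must match one listed by /api/tags.
payload = {
    "model": "codellama:7b-instruct",
    "prompt": "Write a one-line Python hello world.",
    "stream": False,  # return a single JSON response instead of a stream
}

req = urllib.request.Request(
    "http://localhost:11434/api/generate",
    data=json.dumps(payload).encode("utf-8"),
    headers={"Content-Type": "application/json"},
)

with urllib.request.urlopen(req, timeout=120) as resp:
    body = json.load(resp)

# If this prints a completion, the chat model works over HTTP; any remaining
# problem is in the extension's provider settings, not in Ollama itself.
print(body.get("response", "").strip())
```

If this returns a completion but the chat panel still stays empty, the mismatch is most likely in the provider settings (hostname, port, or model name) rather than in Ollama.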
reply
Yeah, I read through this and used the default (Ollama). When I call it from my terminal it responds, but aside from that, nothing happens in the extension tab.
Also, from the img above, am I missing something in the config?