ylai@lemmy.ml to LocalLLaMA@sh.itjust.works · English · 10 months ago
Meta releases 'Code Llama 70B', an open-source behemoth to rival private AI development (venturebeat.com)
15 comments
amzd@kbin.social · 10 months ago
Hugging Face has an LLM plugin for code completion in Neovim, by the way!
z3rOR0ne@lemmy.ml · 10 months ago (edited)
Oh nice! Got a link for anyone who comes across this? Save me and others a search, please?

EDIT: Never mind, found it. Going to give it a try later: LLM powered development for Neovim
amzd@kbin.social · 10 months ago
If you use Ollama, you can try the fork I am using. This is my config to make it work: https://github.com/Amzd/nvim.config/blob/main/lua/plugins/llm.lua
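For anyone who doesn't want to open the link, a config like that is roughly this shape: a minimal sketch of a lazy.nvim plugin spec wiring llm.nvim to a local Ollama server. The option names here follow the upstream huggingface/llm.nvim README; the fork linked above may differ, and the model name is just an example, so treat this as an assumption rather than the actual file.

```lua
-- Hedged sketch of a lazy.nvim spec for llm.nvim backed by a local
-- Ollama server (option names from upstream huggingface/llm.nvim;
-- the linked fork may use different ones).
return {
  "huggingface/llm.nvim",
  opts = {
    backend = "ollama",             -- talk to Ollama instead of the HF Inference API
    model = "codellama:7b",         -- example model; anything pulled via `ollama pull`
    url = "http://localhost:11434", -- Ollama's default local endpoint
  },
}
```

Before opening Neovim you'd pull a model and make sure the server is up, e.g. `ollama pull codellama:7b` (the `ollama serve` daemon usually starts automatically with the desktop install).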
z3rOR0ne@lemmy.ml · 10 months ago
Nice, thanks. I'll save this post in case I use Ollama in the future. Right now I use a codellama model and a mythomax model, but I'm not running them via a localhost server; I just use the output in the terminal or LM Studio. This looks interesting, though. Thanks!