Welcome to avante.nvim Discussions! #368
6 comments · 11 replies
-
Is there a free way to use this? lol
-
@yetone Amazing project! How long did it take you to build this?
-
Is it possible to configure the plugin to work with other local LLM providers like LM Studio? I've tried configuring it, but my attempts were unsuccessful.
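LM Studio exposes an OpenAI-compatible local server (by default at http://localhost:1234/v1), so one approach is a custom vendor entry, following the same pattern as the ollama config shared further down this thread. A minimal sketch, assuming LM Studio's default port and its OpenAI-compatible /chat/completions route; the vendor name `lmstudio` and the model placeholder are illustrative, not official avante names:

```lua
-- Inside the avante.nvim `opts` table; a hedged sketch, not an official recipe.
opts = {
  provider = "lmstudio", -- must match the vendor key below
  vendors = {
    ---@type AvanteProvider
    lmstudio = {
      ["local"] = true,
      endpoint = "http://localhost:1234/v1", -- LM Studio's default local server
      model = "your-loaded-model", -- hypothetical placeholder: whichever model LM Studio has loaded
      parse_curl_args = function(opts, code_opts)
        return {
          url = opts.endpoint .. "/chat/completions",
          headers = { ["Content-Type"] = "application/json" },
          body = {
            model = opts.model,
            messages = require("avante.providers").copilot.parse_message(code_opts),
            max_tokens = 2048,
            stream = true,
          },
        }
      end,
      parse_response_data = function(data_stream, event_state, opts)
        require("avante.providers").openai.parse_response(data_stream, event_state, opts)
      end,
    },
  },
},
```

Since LM Studio speaks the same OpenAI wire format as ollama's /v1 endpoint, the same response parser should apply; if requests still fail, the LM Studio server log is a good first place to look.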
-
Just wanna ask: is there an official Discord? A lot of AI tooling uses Discord to stay in touch. It would be nice if avante also supported this way to talk about releases, use cases, and tips and tricks.
-
Fixed!
On Thu, 31 Oct 2024 at 09:12, Peter Petrov Miroshnikov wrote:
Hello,
I have successfully, to a degree, set up Avante in my nvim to use ollama locally.
The requests to the AI seem to work as expected, but avante does not load the code from my project or file.
Any question or request I send seems to default to Python and a fresh new project without context.
While I am in the file and check the Repo Map, it seems to read the file and understand it.
But once I open the chat window with <leader>aa, it seems it doesn't catch any of the buffers.
What could be the issue, as this seems to be the main point of the product? 😓
```lua
{
  "yetone/avante.nvim",
  event = "VeryLazy",
  lazy = true,
  version = false, -- set this if you want to always pull the latest change
  opts = {
    --provider = "copilot",
    provider = "ollama",
    use_absolute_path = true,
    vendors = {
      ---@type AvanteProvider
      ollama = {
        ["local"] = true,
        endpoint = "127.0.0.1:11434/v1",
        model = "codegemma",
        parse_curl_args = function(opts, code_opts)
          return {
            url = opts.endpoint .. "/chat/completions",
            headers = {
              ["Accept"] = "application/json",
              ["Content-Type"] = "application/json",
            },
            body = {
              model = opts.model,
              messages = require("avante.providers").copilot.parse_message(code_opts), -- you can make your own message, but this is very advanced
              max_tokens = 2048,
              stream = true,
            },
          }
        end,
        parse_response_data = function(data_stream, event_state, opts)
          require("avante.providers").openai.parse_response(data_stream, event_state, opts)
        end,
      },
    },
    -- add any opts here
  },
  -- if you want to build from source then do `make BUILD_FROM_SOURCE=true`
  build = "make",
  -- build = "powershell -ExecutionPolicy Bypass -File Build.ps1 -BuildFromSource false" -- for windows
  dependencies = {
    "stevearc/dressing.nvim",
    "nvim-lua/plenary.nvim",
    "MunifTanjim/nui.nvim",
    --- The below dependencies are optional,
    "nvim-tree/nvim-web-devicons", -- or echasnovski/mini.icons
    "zbirenbaum/copilot.lua", -- for provider = "copilot"
    {
      -- support for image pasting
      "HakonHarnes/img-clip.nvim",
      event = "VeryLazy",
      opts = {
        -- recommended settings
        default = {
          embed_image_as_base64 = false,
          prompt_for_file_name = false,
          drag_and_drop = {
            insert_mode = true,
          },
          -- required for Windows users
          use_absolute_path = true,
        },
      },
    },
    {
      -- Make sure to set this up properly if you have lazy=true
      "MeanderingProgrammer/render-markdown.nvim",
      opts = {
        file_types = { "markdown", "Avante" },
      },
      ft = { "markdown", "Avante" },
    },
  },
},
```
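For anyone hitting the same symptom: one detail worth double-checking in the config above (an assumption on my part, not the confirmed fix) is that the endpoint has no http:// scheme. Being explicit removes one source of ambiguity when requests silently misbehave:

```lua
-- A hedged tweak to the vendor entry above, not the confirmed fix:
ollama = {
  ["local"] = true,
  endpoint = "http://127.0.0.1:11434/v1", -- explicit scheme instead of a bare host:port
  model = "codegemma",
  -- parse_curl_args and parse_response_data unchanged from the config above
},
```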
-
👋 Welcome!
We're using Discussions as a place to connect with other members of our community. We hope that you build together 💪.
To get started, comment below with an introduction of yourself and tell us about what you do with this community.