With the rise of locally runnable LLMs, I appreciate how Windsurf IDE operates, but I would also prefer to keep my code local. An alternative would be the Continue extension in VS Code, but I enjoy using Windsurf more. If possible, could you guide me on how to forward the Windsurf chat API to my localhost? Thank you!
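To clarify what I have in mind: roughly a small local proxy that intercepts the IDE's chat requests and re-issues them against a local OpenAI-compatible server. The sketch below is only an illustration of that idea, not a working integration; it assumes Windsurf's traffic could somehow be redirected to a local port (which is exactly the part I'm asking about), and that a local backend such as Ollama is listening on its default `http://localhost:11434`.

```python
# Minimal forwarding-proxy sketch (assumptions: the IDE's chat traffic can be
# pointed at 127.0.0.1:8080, and a local OpenAI-compatible server such as
# Ollama is running on its default port 11434 -- both are hypothetical here).
import http.server
import urllib.request

LOCAL_LLM = "http://localhost:11434"  # assumed local backend (Ollama default)

class ForwardingHandler(http.server.BaseHTTPRequestHandler):
    def do_POST(self):
        # Read the incoming chat request body.
        length = int(self.headers.get("Content-Length", 0))
        body = self.rfile.read(length)
        # Re-issue the same request, same path, against the local backend.
        req = urllib.request.Request(
            LOCAL_LLM + self.path,
            data=body,
            headers={"Content-Type": "application/json"},
            method="POST",
        )
        try:
            with urllib.request.urlopen(req) as resp:
                payload = resp.read()
                self.send_response(resp.status)
                self.send_header("Content-Type", "application/json")
                self.end_headers()
                self.wfile.write(payload)
        except OSError as exc:
            # Local backend not running or unreachable.
            self.send_error(502, f"local backend unreachable: {exc}")

if __name__ == "__main__":
    # Listen where the IDE's traffic would be redirected (port is an assumption).
    http.server.HTTPServer(("127.0.0.1", 8080), ForwardingHandler).serve_forever()
```

Whether Windsurf's endpoint can actually be redirected this way (hosts-file override, settings, or otherwise) is the open question.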