[Bug]: Azure OpenAI Adaptor Is Not Working
Introduction
This article reports a bug in the Azure OpenAI adaptor for CodeCompanion, a Neovim plugin. The issue is that the adaptor is not working as expected, resulting in an error when trying to save a chat. This bug report provides detailed information about the issue, including the steps to reproduce it, the expected behavior, and the log output.
Did You Check the Docs and Existing Issues?
Before submitting this issue, I have thoroughly checked the documentation and existing issues related to this problem. I have also updated the plugin to the latest version and searched for existing issues and discussions.
Neovim Version and Operating System
The Neovim version in use is 0.10.3, and the operating system is macOS 14.3.1.
Describe the Bug
The bug occurs when trying to save a chat using the Azure OpenAI adaptor. The error message is:
```
E5108: Error executing lua: ...codecompanion.nvim/lua/codecompanion/adapters/openai.lua:41: attempt to index local 'choices' (a nil value)
stack traceback:
...codecompanion.nvim/lua/codecompanion/adapters/openai.lua:41: in function 'setup'
.../nvim/lazy/codecompanion.nvim/lua/codecompanion/http.lua:70: in function 'request'
...ompanion.nvim/lua/codecompanion/strategies/chat/init.lua:755: in function 'submit'
...anion.nvim/lua/codecompanion/strategies/chat/keymaps.lua:216: in function 'rhs'
...y/codecompanion.nvim/lua/codecompanion/utils/keymaps.lua:80: in function <...y/codecompanion.nvim/lua/codecompanion/utils/keymaps.lua:77>
```
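The traceback suggests the adaptor decoded a response body that contains no choices array, which is what Azure returns when the request itself fails (for example, a wrong deployment name or key). As a minimal sketch of the failing pattern, assuming a hypothetical extract_content helper and not the plugin's actual handler in openai.lua:

```lua
-- Hypothetical sketch, not CodeCompanion's actual code: illustrates why indexing
-- `choices` crashes when Azure returns an error payload instead of a completion.
local function extract_content(response_body)
  local ok, data = pcall(vim.json.decode, response_body)
  if not ok or type(data) ~= "table" or data.choices == nil then
    -- Azure error objects (bad deployment name, api-version, or key) carry no `choices`.
    return nil, "no choices in response: " .. tostring(response_body)
  end
  return data.choices[1].message.content
end
```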
Steps to Reproduce
To reproduce this issue, follow these steps (a quick environment-variable check is sketched after the list):
- Add the Azure OpenAI adaptor to the CodeCompanion configuration.
- Start a chat using the :CodeCompanionChat command.
- Write a request in the chat window.
- Press the save keymap to save the chat.
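Before the first step, it is worth confirming that the two environment variables referenced in the init.lua below are actually visible to the Neovim process; if either is empty, the Azure API rejects the request and the adaptor receives a body without a choices field. This is my own sanity check, not part of the plugin, and can be run with :lua:

```lua
-- Sanity check: confirm the environment variables used by the repro config
-- (AZURE_OPENAI_KEY and AZURE_OPENAI_ENDPOINT) are visible to Neovim.
for _, name in ipairs({ "AZURE_OPENAI_KEY", "AZURE_OPENAI_ENDPOINT" }) do
  local value = os.getenv(name)
  print(name, (value and value ~= "") and "set" or "NOT SET")
end
```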
Expected Behavior
The expected behavior is that the chat should be saved successfully without any errors.
init.lua File
The init.lua file used for the reproduction is provided below:
```lua
--[[
NOTE: Set the config path to enable the copilot adapter to work.
It will search the following paths for a token:
  - "$CODECOMPANION_TOKEN_PATH/github-copilot/hosts.json"
  - "$CODECOMPANION_TOKEN_PATH/github-copilot/apps.json"
--]]
vim.env["CODECOMPANION_TOKEN_PATH"] = vim.fn.expand("~/.config")

vim.env.LAZY_STDPATH = ".repro"
load(vim.fn.system("curl -s https://raw.githubusercontent.com/folke/lazy.nvim/main/bootstrap.lua"))()

-- Your CodeCompanion setup
local plugins = {
  {
    "olimorris/codecompanion.nvim",
    dependencies = {
      { "nvim-treesitter/nvim-treesitter", build = ":TSUpdate" },
      { "nvim-lua/plenary.nvim" },
      -- Test with blink.cmp (delete if not required)
      {
        "saghen/blink.cmp",
        lazy = false,
        version = "*",
        opts = {
          keymap = {
            preset = "enter",
            ["<S-Tab>"] = { "select_prev", "fallback" },
            ["<Tab>"] = { "select_next", "fallback" },
          },
          cmdline = { sources = { "cmdline" } },
          sources = {
            default = { "lsp", "path", "buffer", "codecompanion" },
          },
        },
      },
      -- Test with nvim-cmp
      -- { "hrsh7th/nvim-cmp" },
    },
    opts = {
      -- Refer to: https://github.com/olimorris/codecompanion.nvim/blob/main/lua/codecompanion/config.lua
      adapters = {
        azure_openai = function()
          return require("codecompanion.adapters").extend("azure_openai", {
            env = {
              api_key = os.getenv("AZURE_OPENAI_KEY"),
              endpoint = os.getenv("AZURE_OPENAI_ENDPOINT"),
            },
            schema = {
              model = {
                default = "gpt-4.1",
              },
            },
          })
        end,
      },
      strategies = {
        -- NOTE: Change the adapter as required
        chat = { adapter = "azure_openai" },
        inline = { adapter = "azure_openai" },
      },
      opts = {
        log_level = "DEBUG",
      },
    },
  },
}

-- Leaving this comment in to see if the issue author notices ;-)
-- This is so I can tell if they've really tested with their own repro.lua file
require("lazy.minit").repro({ spec = plugins })

-- Setup Tree-sitter
local ts_status, treesitter = pcall(require, "nvim-treesitter.configs")
if ts_status then
  treesitter.setup({
    ensure_installed = { "lua", "markdown", "markdown_inline", "yaml", "diff" },
    highlight = { enable = true },
  })
end

-- Setup nvim-cmp
-- local cmp_status, cmp = pcall(require, "cmp")
-- if cmp_status then
--   cmp.setup({
--     mapping = cmp.mapping.preset.insert({
--       ["<C-b>"] = cmp.mapping.scroll_docs(-4),
--       ["<C-f>"] = cmp.mapping.scroll_docs(4),
--       ["<C-Space>"] = cmp.mapping.complete(),
--       ["<C-e>"] = cmp.mapping.abort(),
--       ["<CR>"] = cmp.mapping.confirm({ select = true }),
--       -- Accept currently selected item. Set `select` to `false` to only confirm explicitly selected items.
--     }),
--   })
-- end
```
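Since the error points at a response with no choices, a useful diagnostic is to call the Azure chat-completions endpoint directly from the same Neovim session and inspect the raw body. The sketch below is my own check rather than part of CodeCompanion; it uses plenary.curl (already a dependency in the repro) and assumes the standard Azure URL layout, a deployment named gpt-4.1, and api-version 2024-02-01, all of which may need adjusting for a given resource:

```lua
-- Hedged diagnostic, not part of the plugin: POST directly to Azure and print the
-- raw response. An error object here (instead of a body with `choices`) points to a
-- configuration problem (endpoint, deployment name, api-version, or key).
local curl = require("plenary.curl")
local endpoint = os.getenv("AZURE_OPENAI_ENDPOINT")
local deployment = "gpt-4.1" -- assumption: deployment name matches the model default above
local url = string.format(
  "%s/openai/deployments/%s/chat/completions?api-version=2024-02-01", -- api-version is an assumption
  endpoint,
  deployment
)
local res = curl.post(url, {
  headers = {
    ["api-key"] = os.getenv("AZURE_OPENAI_KEY"),
    ["Content-Type"] = "application/json",
  },
  body = vim.json.encode({ messages = { { role = "user", content = "ping" } } }),
})
print(res.status, res.body)
```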
Log Output
The log output is provided below:
```
[INFO] 2025-04-21 07:04:50
Chat request started
[INFO] 2025-04-21 07:04:50
ELAPSED TIME: 23.096041
[INFO] 2025-04-21 07:08:29
Chat request started
[INFO] 2025-04-21 07:08:29
ELAPSED TIME: 21.595916
[INFO] 2025-04-21 07:09:52
Chat request started
[INFO] 2025-04-21 07:09:52
ELAPSED TIME: 23.141416
[INFO] 2025-04-21 07:25:39
Chat request started
[INFO] 2025-04-21 07:25:39
ELAPSED TIME: 24.304125
```
Have You Provided and Tested with a repro.lua File?
Yes, I have tested and provided a repro.lua file.
Conclusion
Q: What is the Azure OpenAI Adaptor?
A: The Azure OpenAI adaptor is one of the adapters that ships with CodeCompanion, a Neovim plugin. It lets CodeCompanion send chat and inline requests to the Azure OpenAI API for generating text, answering questions, and other tasks.
Q: What is the issue with the Azure OpenAI Adaptor?
A: The adaptor is not working as expected: submitting a chat fails with the "attempt to index local 'choices' (a nil value)" error raised from openai.lua, as shown in the traceback earlier in this report.
Q: What is the solution to the issue?
A: Either fix the bug in the Azure OpenAI adaptor's response handling, or correct the adaptor's configuration (for example, the endpoint, deployment name, or API key) if the adaptor itself is working as intended.
Q: How can I get help with the issue?
A: If you are experiencing the same issue, you can try the following:
- Check the documentation and search existing issues and discussions.
- Update the plugin to the latest version.
- Provide a repro.lua file to help identify and fix the issue.
- Contact the plugin author or the community for help.
Q: What is the next step?
A: The next step is to determine whether the problem is a bug in the adaptor or a configuration error, and then fix it accordingly. I hope this Q&A article helps to identify and fix the issue.