LibreChat/api/server/utils/import
Danny Avila 65990a33e9
📥 fix: Resolve Imported-Conversation Default Model From Runtime modelsConfig (#12885)
* 📥 fix: Use Endpoint-Aware Default Model on Imported Conversations

Claude conversations imported from claude.ai's data export display
"gpt-4o-mini" in the chat UI until the page is refreshed, and any
attempt to send a message before refreshing fails with "The model
'gpt-4o-mini' is not available for Anthropic."

Root cause: ImportBatchBuilder.finishConversation() unconditionally
defaulted the saved conversation's `model` field to
openAISettings.model.default, regardless of `this.endpoint`. Claude
exports don't carry a model name, so every imported Claude conversation
landed with endpoint=anthropic but model=gpt-4o-mini.

Fix: pick the default based on `this.endpoint` via a small lookup
(openAI -> gpt-4o-mini, anthropic -> claude-3-5-sonnet-latest), keeping
the existing OpenAI default as the fallback for unknown endpoints.

Fixes #12844
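The endpoint-aware lookup from this first commit can be sketched as follows. This is an illustrative reconstruction, not the actual LibreChat source: the map name, function name, and the exact model strings beyond those quoted above are assumptions.

```javascript
// Hypothetical sketch of the per-endpoint default lookup described above.
// Names are illustrative; the real settings objects in LibreChat differ.
const DEFAULT_MODEL_BY_ENDPOINT = {
  openAI: 'gpt-4o-mini',
  anthropic: 'claude-3-5-sonnet-latest',
};

function defaultModelFor(endpoint) {
  // Unknown endpoints keep the pre-existing OpenAI default as fallback.
  return DEFAULT_MODEL_BY_ENDPOINT[endpoint] ?? 'gpt-4o-mini';
}
```

With this in place, an imported Claude conversation gets `claude-3-5-sonnet-latest` instead of inheriting the OpenAI default.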

* 🪄 refactor: Resolve Import Default Model From `modelsConfig`

Replace the hardcoded per-endpoint default lookup added in the previous
commit with a runtime resolver that consults the same models config the
chat UI uses (`getModelsConfig` in ModelController -> `loadDefaultModels`
+ `loadConfigModels`). This way an imported conversation defaults to a
model the LibreChat instance has actually configured / discovered for
the endpoint, instead of a hardcoded constant that may not exist on this
deployment.

Resolution order:
1. First non-empty model in `modelsConfig[endpoint]`.
2. Per-endpoint hardcoded fallback (anthropic/openAI settings) if the
   runtime config is empty for the endpoint or `getModelsConfig` throws.
3. `openAISettings.model.default` if even the per-endpoint fallback is
   missing (unknown endpoint).
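The three-step resolution order above can be sketched as a small async resolver. This is a minimal sketch under stated assumptions: `getModelsConfig` stands in for LibreChat's real config loader, `modelsConfig[endpoint]` is assumed to be an array of model-name strings, and the fallback constants are illustrative.

```javascript
// Sketch of the resolution order: runtime config first, then a
// per-endpoint fallback, then the OpenAI default. Names are illustrative.
const ENDPOINT_FALLBACKS = {
  openAI: 'gpt-4o-mini',
  anthropic: 'claude-3-5-sonnet-latest',
};
const OPENAI_DEFAULT = 'gpt-4o-mini';

async function resolveImportDefaultModel({ endpoint, getModelsConfig }) {
  let models = [];
  try {
    const config = await getModelsConfig();
    models = config?.[endpoint] ?? [];
  } catch {
    // If the config loader throws, fall through to the hardcoded fallbacks.
  }
  // 1. First non-empty model the instance has configured for the endpoint.
  const fromConfig = models.find((m) => typeof m === 'string' && m.length > 0);
  if (fromConfig) {
    return fromConfig;
  }
  // 2./3. Per-endpoint fallback, else the OpenAI default for unknown endpoints.
  return ENDPOINT_FALLBACKS[endpoint] ?? OPENAI_DEFAULT;
}
```

The key design point is that step 1 consults the same runtime config the chat UI uses, so the imported conversation's model is one the deployment can actually serve.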

`importBatchBuilder.finishConversation` now accepts an optional
`defaultModel` argument; each importer resolves it once at the top via
`resolveImportDefaultModel({ endpoint, requestUserId, userRole })` and
threads it through. ChatGPT message-level model selection now also tries
the resolved default before falling back to the hardcoded gpt-4o-mini.
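The resolve-once-then-thread-through pattern can be sketched as below. This is a hypothetical simplification: the real `ImportBatchBuilder` constructor, the shape of its saved conversations, and the `finishConversation` signature differ in LibreChat.

```javascript
// Illustrative sketch: the builder no longer hardcodes the OpenAI default;
// callers pass in the model resolved once per import run.
class ImportBatchBuilder {
  constructor(endpoint) {
    this.endpoint = endpoint;
    this.conversations = [];
  }

  // `defaultModel` is optional; the old behavior (OpenAI default) remains
  // the fallback when no resolved model is supplied.
  finishConversation(title, createdAt, defaultModel = 'gpt-4o-mini') {
    this.conversations.push({
      endpoint: this.endpoint,
      model: defaultModel,
      title,
      createdAt,
    });
  }
}

// Usage: resolve once at the top of the importer, then thread it through
// every finishConversation call instead of re-resolving per conversation.
function importAll(titles, endpoint, resolvedModel) {
  const builder = new ImportBatchBuilder(endpoint);
  for (const title of titles) {
    builder.finishConversation(title, new Date().toISOString(), resolvedModel);
  }
  return builder.conversations;
}
```

Resolving once per import batch avoids hitting the models config for every conversation in a large export.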
2026-04-30 00:43:04 -04:00
__data__
defaults.js 📥 fix: Resolve Imported-Conversation Default Model From Runtime modelsConfig (#12885) 2026-04-30 00:43:04 -04:00
defaults.spec.js 📥 fix: Resolve Imported-Conversation Default Model From Runtime modelsConfig (#12885) 2026-04-30 00:43:04 -04:00
fork.js 📦 refactor: Consolidate DB models, encapsulating Mongoose usage in data-schemas (#11830) 2026-03-21 14:28:53 -04:00
fork.spec.js 📦 refactor: Consolidate DB models, encapsulating Mongoose usage in data-schemas (#11830) 2026-03-21 14:28:53 -04:00
importBatchBuilder.js 📥 fix: Resolve Imported-Conversation Default Model From Runtime modelsConfig (#12885) 2026-04-30 00:43:04 -04:00
importConversations.js 🏗️ refactor: Remove Redundant Caching, Migrate Config Services to TypeScript (#12466) 2026-03-30 16:49:48 -04:00
importers-timestamp.spec.js 🏗️ refactor: Remove Redundant Caching, Migrate Config Services to TypeScript (#12466) 2026-03-30 16:49:48 -04:00
importers.js 📥 fix: Resolve Imported-Conversation Default Model From Runtime modelsConfig (#12885) 2026-04-30 00:43:04 -04:00
importers.spec.js 📥 fix: Resolve Imported-Conversation Default Model From Runtime modelsConfig (#12885) 2026-04-30 00:43:04 -04:00
index.js