Merge remote-tracking branch 'upstream/main'

Andrew Trokhymenko 2024-11-21 23:31:30 -05:00
commit 70d14df953
22 changed files with 374 additions and 217 deletions


@@ -49,6 +49,10 @@ OPENAI_LIKE_API_KEY=
# You only need this environment variable set if you want to use Mistral models
MISTRAL_API_KEY=

# Get the Cohere Api key by following these instructions -
# https://dashboard.cohere.com/api-keys
# You only need this environment variable set if you want to use Cohere models
COHERE_API_KEY=

# Get LMStudio Base URL from LM Studio Developer Console
# Make sure to enable CORS
@@ -62,3 +66,11 @@ XAI_API_KEY=
# Include this environment variable if you want more logging for debugging locally
VITE_LOG_LEVEL=debug

# Example Context Values for qwen2.5-coder:32b
#
# DEFAULT_NUM_CTX=32768 # Consumes 36GB of VRAM
# DEFAULT_NUM_CTX=24576 # Consumes 32GB of VRAM
# DEFAULT_NUM_CTX=12288 # Consumes 26GB of VRAM
# DEFAULT_NUM_CTX=6144 # Consumes 24GB of VRAM
DEFAULT_NUM_CTX=

.gitignore

@@ -22,6 +22,7 @@ dist-ssr
*.sln
*.sw?

/.history
/.cache
/build
.env.local


@@ -1,4 +1,7 @@
# Contributing to Bolt.new Fork

## DEFAULT_NUM_CTX

The `DEFAULT_NUM_CTX` environment variable can be used to limit the maximum number of context values used by the qwen2.5-coder model. For example, to limit the context to 24576 values (which uses 32GB of VRAM), set `DEFAULT_NUM_CTX=24576` in your `.env.local` file.

First off, thank you for considering contributing to Bolt.new! This fork aims to expand the capabilities of the original project by integrating multiple LLM providers and enhancing functionality. Every contribution helps make Bolt.new a better tool for developers worldwide.
@@ -81,6 +84,19 @@ ANTHROPIC_API_KEY=XXX
```bash
VITE_LOG_LEVEL=debug
```
- Optionally set context size:
```bash
DEFAULT_NUM_CTX=32768
```
Some example context values for the qwen2.5-coder:32b model:
* DEFAULT_NUM_CTX=32768 - Consumes 36GB of VRAM
* DEFAULT_NUM_CTX=24576 - Consumes 32GB of VRAM
* DEFAULT_NUM_CTX=12288 - Consumes 26GB of VRAM
* DEFAULT_NUM_CTX=6144 - Consumes 24GB of VRAM

**Important**: Never commit your `.env.local` file to version control. It's already included in .gitignore.
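The `DEFAULT_NUM_CTX` handling this commit adds boils down to one parse-with-fallback step; a minimal sketch (the function name `parseNumCtx` is illustrative, not one used in the codebase):

```typescript
// Sketch of the DEFAULT_NUM_CTX parsing introduced by this commit:
// an unset or empty value falls back to the previous hard-coded 32768.
function parseNumCtx(raw: string | undefined): number {
  return raw ? parseInt(raw, 10) : 32768;
}

// In the app this would be fed process.env.DEFAULT_NUM_CTX before
// passing the result to Ollama's numCtx option.
const numCtx = parseNumCtx(process.env.DEFAULT_NUM_CTX);
```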
### 🚀 Running the Development Server


@@ -26,6 +26,7 @@ ARG OPEN_ROUTER_API_KEY
ARG GOOGLE_GENERATIVE_AI_API_KEY
ARG OLLAMA_API_BASE_URL
ARG VITE_LOG_LEVEL=debug
ARG DEFAULT_NUM_CTX

ENV WRANGLER_SEND_METRICS=false \
    GROQ_API_KEY=${GROQ_API_KEY} \
@@ -35,7 +36,8 @@ ENV WRANGLER_SEND_METRICS=false \
    OPEN_ROUTER_API_KEY=${OPEN_ROUTER_API_KEY} \
    GOOGLE_GENERATIVE_AI_API_KEY=${GOOGLE_GENERATIVE_AI_API_KEY} \
    OLLAMA_API_BASE_URL=${OLLAMA_API_BASE_URL} \
    VITE_LOG_LEVEL=${VITE_LOG_LEVEL} \
    DEFAULT_NUM_CTX=${DEFAULT_NUM_CTX}

# Pre-configure wrangler to disable metrics
RUN mkdir -p /root/.config/.wrangler && \
@@ -57,6 +59,7 @@ ARG OPEN_ROUTER_API_KEY
ARG GOOGLE_GENERATIVE_AI_API_KEY
ARG OLLAMA_API_BASE_URL
ARG VITE_LOG_LEVEL=debug
ARG DEFAULT_NUM_CTX

ENV GROQ_API_KEY=${GROQ_API_KEY} \
    HuggingFace_API_KEY=${HuggingFace_API_KEY} \
@@ -65,7 +68,8 @@ ENV GROQ_API_KEY=${GROQ_API_KEY} \
    OPEN_ROUTER_API_KEY=${OPEN_ROUTER_API_KEY} \
    GOOGLE_GENERATIVE_AI_API_KEY=${GOOGLE_GENERATIVE_AI_API_KEY} \
    OLLAMA_API_BASE_URL=${OLLAMA_API_BASE_URL} \
    VITE_LOG_LEVEL=${VITE_LOG_LEVEL} \
    DEFAULT_NUM_CTX=${DEFAULT_NUM_CTX}

RUN mkdir -p ${WORKDIR}/run
CMD pnpm run dev --host


@@ -1,8 +1,12 @@
[![Bolt.new: AI-Powered Full-Stack Web Development in the Browser](./public/social_preview_index.jpg)](https://bolt.new)

# Bolt.new Fork by Cole Medin - oTToDev

This fork of Bolt.new (oTToDev) allows you to choose the LLM that you use for each prompt! Currently, you can use OpenAI, Anthropic, Ollama, OpenRouter, Gemini, LMStudio, Mistral, xAI, HuggingFace, DeepSeek, or Groq models - and it is easily extended to use any other model supported by the Vercel AI SDK! See the instructions below for running this locally and extending it to include more models.

Join the community for oTToDev!

https://thinktank.ottomator.ai

# Requested Additions to this Fork - Feel Free to Contribute!!
@@ -20,21 +24,24 @@ This fork of Bolt.new allows you to choose the LLM that you use for each prompt!
- ✅ Publish projects directly to GitHub (@goncaloalves)
- ✅ Ability to enter API keys in the UI (@ali00209)
- ✅ xAI Grok Beta Integration (@milutinke)
- ✅ LM Studio Integration (@karrot0)
- ✅ HuggingFace Integration (@ahsan3219)
- ✅ Bolt terminal to see the output of LLM run commands (@thecodacus)
- ✅ Streaming of code output (@thecodacus)
- ✅ Ability to revert code to earlier version (@wonderwhy-er)
- ⬜ **HIGH PRIORITY** - Prevent Bolt from rewriting files as often (file locking and diffs)
- ⬜ **HIGH PRIORITY** - Better prompting for smaller LLMs (code window sometimes doesn't start)
- ⬜ **HIGH PRIORITY** - Load local projects into the app
- ⬜ **HIGH PRIORITY** - Attach images to prompts
- ⬜ **HIGH PRIORITY** - Run agents in the backend as opposed to a single model call
- ⬜ Mobile friendly
- ⬜ Together Integration
- ⬜ Azure Open AI API Integration
- ⬜ Perplexity Integration
- ⬜ Vertex AI Integration
- ✅ Cohere Integration (@hasanraiyan)
- ✅ Dynamic model max token length (@hasanraiyan)
- ⬜ Deploy directly to Vercel/Netlify/other similar platforms
- ⬜ Prompt caching
- ⬜ Better prompt enhancing
- ⬜ Have LLM plan the project in a MD file for better results/transparency


@@ -10,11 +10,7 @@ interface APIKeyManagerProps {
  labelForGetApiKey?: string;
}

export const APIKeyManager: React.FC<APIKeyManagerProps> = ({ provider, apiKey, setApiKey }) => {
  const [isEditing, setIsEditing] = useState(false);
  const [tempKey, setTempKey] = useState(apiKey);
@@ -24,15 +20,29 @@ export const APIKeyManager: React.FC<APIKeyManagerProps> = ({
  };

  return (
    <div className="flex items-start sm:items-center mt-2 mb-2 flex-col sm:flex-row">
      <div>
        <span className="text-sm text-bolt-elements-textSecondary">{provider?.name} API Key:</span>
        {!isEditing && (
          <div className="flex items-center mb-4">
            <span className="flex-1 text-xs text-bolt-elements-textPrimary mr-2">
              {apiKey ? '••••••••' : 'Not set (will still work if set in .env file)'}
            </span>
            <IconButton onClick={() => setIsEditing(true)} title="Edit API Key">
              <div className="i-ph:pencil-simple" />
            </IconButton>
          </div>
        )}
      </div>

      {isEditing ? (
        <div className="flex items-center gap-3 mt-2">
          <input
            type="password"
            value={tempKey}
            placeholder="Your API Key"
            onChange={(e) => setTempKey(e.target.value)}
            className="flex-1 px-2 py-1 text-xs lg:text-sm rounded border border-bolt-elements-borderColor bg-bolt-elements-prompt-background text-bolt-elements-textPrimary focus:outline-none focus:ring-2 focus:ring-bolt-elements-focus"
          />
          <IconButton onClick={handleSave} title="Save API Key">
            <div className="i-ph:check" />
@@ -40,20 +50,15 @@ export const APIKeyManager: React.FC<APIKeyManagerProps> = ({
          <IconButton onClick={() => setIsEditing(false)} title="Cancel">
            <div className="i-ph:x" />
          </IconButton>
        </div>
      ) : (
        <>
          {provider?.getApiKeyLink && (
            <IconButton className="ml-auto" onClick={() => window.open(provider?.getApiKeyLink)} title="Edit API Key">
              <span className="mr-2 text-xs lg:text-sm">{provider?.labelForGetApiKey || 'Get API Key'}</span>
              <div className={provider?.icon || 'i-ph:key'} />
            </IconButton>
          )}
        </>
      )}
    </div>


@@ -29,9 +29,9 @@ const EXAMPLE_PROMPTS = [
const providerList = PROVIDER_LIST;

const ModelSelector = ({ model, setModel, provider, setProvider, modelList, providerList, apiKeys }) => {
  return (
    <div className="mb-2 flex gap-2 flex-col sm:flex-row">
      <select
        value={provider?.name}
        onChange={(e) => {
@@ -51,8 +51,7 @@ const ModelSelector = ({ model, setModel, provider, setProvider, modelList, prov
        key={provider?.name}
        value={model}
        onChange={(e) => setModel(e.target.value)}
        className="flex-1 p-2 rounded-lg border border-bolt-elements-borderColor bg-bolt-elements-prompt-background text-bolt-elements-textPrimary focus:outline-none focus:ring-2 focus:ring-bolt-elements-focus transition-all lg:max-w-[70%]"
      >
        {[...modelList]
          .filter((e) => e.provider == provider?.name && e.name)
@@ -193,25 +192,25 @@ export const BaseChat = React.forwardRef<HTMLDivElement, BaseChatProps>(
      ref={ref}
      className={classNames(
        styles.BaseChat,
        'relative flex flex-col lg:flex-row h-full w-full overflow-hidden bg-bolt-elements-background-depth-1',
      )}
      data-chat-visible={showChat}
    >
      <ClientOnly>{() => <Menu />}</ClientOnly>
      <div ref={scrollRef} className="flex flex-col lg:flex-row overflow-y-auto w-full h-full">
        <div className={classNames(styles.Chat, 'flex flex-col flex-grow lg:min-w-[var(--chat-min-width)] h-full')}>
          {!chatStarted && (
            <div id="intro" className="mt-[26vh] max-w-chat mx-auto text-center px-4 lg:px-0">
              <h1 className="text-3xl lg:text-6xl font-bold text-bolt-elements-textPrimary mb-4 animate-fade-in">
                Where ideas begin
              </h1>
              <p className="text-md lg:text-xl mb-8 text-bolt-elements-textSecondary animate-fade-in animation-delay-200">
                Bring ideas to life in seconds or get help on existing projects.
              </p>
            </div>
          )}
          <div
            className={classNames('pt-6 px-2 sm:px-6', {
              'h-full flex flex-col': chatStarted,
            })}
          >
@@ -220,7 +219,7 @@ export const BaseChat = React.forwardRef<HTMLDivElement, BaseChatProps>(
              return chatStarted ? (
                <Messages
                  ref={messageRef}
                  className="flex flex-col w-full flex-1 max-w-chat pb-6 mx-auto z-1"
                  messages={messages}
                  isStreaming={isStreaming}
                />
@@ -228,9 +227,12 @@ export const BaseChat = React.forwardRef<HTMLDivElement, BaseChatProps>(
            }}
          </ClientOnly>
          <div
            className={classNames(
              'bg-bolt-elements-background-depth-2 p-3 rounded-lg border border-bolt-elements-borderColor relative w-full max-w-chat mx-auto z-prompt mb-6',
              {
                'sticky bottom-2': chatStarted,
              },
            )}
          >
            <ModelSelector
              key={provider?.name + ':' + modelList.length}
@@ -240,7 +242,9 @@ export const BaseChat = React.forwardRef<HTMLDivElement, BaseChatProps>(
              provider={provider}
              setProvider={setProvider}
              providerList={PROVIDER_LIST}
              apiKeys={apiKeys}
            />
            {provider && (
              <APIKeyManager
                provider={provider}
@@ -248,7 +252,6 @@ export const BaseChat = React.forwardRef<HTMLDivElement, BaseChatProps>(
                setApiKey={(key) => updateApiKey(provider.name, key)}
              />
            )}
            <FilePreview
              files={uploadedFiles}
              imageDataList={imageDataList}
@@ -257,7 +260,6 @@ export const BaseChat = React.forwardRef<HTMLDivElement, BaseChatProps>(
                setImageDataList?.(imageDataList.filter((_, i) => i !== index));
              }}
            />
            <div
              className={classNames(
                'shadow-lg border border-bolt-elements-borderColor bg-bolt-elements-prompt-background backdrop-filter backdrop-blur-[8px] rounded-lg overflow-hidden transition-all',
@@ -265,7 +267,7 @@ export const BaseChat = React.forwardRef<HTMLDivElement, BaseChatProps>(
            >
              <textarea
                ref={textareaRef}
                className={`w-full pl-4 pt-4 pr-16 focus:outline-none focus:ring-0 focus:border-none focus:shadow-none resize-none text-md text-bolt-elements-textPrimary placeholder-bolt-elements-textTertiary bg-transparent transition-all`}
                onKeyDown={(event) => {
                  if (event.key === 'Enter') {
                    if (event.shiftKey) {
@@ -347,7 +349,6 @@ export const BaseChat = React.forwardRef<HTMLDivElement, BaseChatProps>(
              ) : null}
            </div>
          </div>
        </div>
      </div>
      {!chatStarted && (


@@ -4,7 +4,7 @@ import { classNames } from '~/utils/classNames';
import { AssistantMessage } from './AssistantMessage';
import { UserMessage } from './UserMessage';
import * as Tooltip from '@radix-ui/react-tooltip';
import { useLocation } from '@remix-run/react';
import { db, chatId } from '~/lib/persistence/useChatHistory';
import { forkChat } from '~/lib/persistence/db';
import { toast } from 'react-toastify';
@@ -19,7 +19,6 @@ interface MessagesProps {
export const Messages = React.forwardRef<HTMLDivElement, MessagesProps>((props: MessagesProps, ref) => {
  const { id, isStreaming = false, messages = [] } = props;
  const location = useLocation();

  const handleRewind = (messageId: string) => {
    const searchParams = new URLSearchParams(location.search);
@@ -69,23 +68,26 @@ export const Messages = React.forwardRef<HTMLDivElement, MessagesProps>((props:
              <div className="grid grid-col-1 w-full">
                {isUserMessage ? <UserMessage content={content} /> : <AssistantMessage content={content} />}
              </div>
              {!isUserMessage && (
                <div className="flex gap-2 flex-col lg:flex-row">
                  <Tooltip.Root>
                    <Tooltip.Trigger asChild>
                      {messageId && (
                        <button
                          onClick={() => handleRewind(messageId)}
                          key="i-ph:arrow-u-up-left"
                          className={classNames(
                            'i-ph:arrow-u-up-left',
                            'text-xl text-bolt-elements-textSecondary hover:text-bolt-elements-textPrimary transition-colors',
                          )}
                        />
                      )}
                    </Tooltip.Trigger>
                    <Tooltip.Portal>
                      <Tooltip.Content
                        className="bg-bolt-elements-tooltip-background text-bolt-elements-textPrimary px-3 py-2 rounded-lg text-sm shadow-lg"
                        sideOffset={5}
                        style={{ zIndex: 1000 }}
                      >
                        Revert to this message
                        <Tooltip.Arrow className="fill-bolt-elements-tooltip-background" />
@@ -97,10 +99,10 @@ export const Messages = React.forwardRef<HTMLDivElement, MessagesProps>((props:
                    <Tooltip.Trigger asChild>
                      <button
                        onClick={() => handleFork(messageId)}
                        key="i-ph:git-fork"
                        className={classNames(
                          'i-ph:git-fork',
                          'text-xl text-bolt-elements-textSecondary hover:text-bolt-elements-textPrimary transition-colors',
                        )}
                      />
                    </Tooltip.Trigger>
@@ -108,14 +110,15 @@ export const Messages = React.forwardRef<HTMLDivElement, MessagesProps>((props:
                      <Tooltip.Content
                        className="bg-bolt-elements-tooltip-background text-bolt-elements-textPrimary px-3 py-2 rounded-lg text-sm shadow-lg"
                        sideOffset={5}
                        style={{ zIndex: 1000 }}
                      >
                        Fork chat from this message
                        <Tooltip.Arrow className="fill-bolt-elements-tooltip-background" />
                      </Tooltip.Content>
                    </Tooltip.Portal>
                  </Tooltip.Root>
                </div>
              )}
            </div>
  );
})


@@ -1,4 +1,5 @@
import { useStore } from '@nanostores/react';
import useViewport from '~/lib/hooks';
import { chatStore } from '~/lib/stores/chat';
import { workbenchStore } from '~/lib/stores/workbench';
import { classNames } from '~/utils/classNames';
@@ -9,6 +10,8 @@ export function HeaderActionButtons({}: HeaderActionButtonsProps) {
  const showWorkbench = useStore(workbenchStore.showWorkbench);
  const { showChat } = useStore(chatStore);

  const isSmallViewport = useViewport(1024);

  const canHideChat = showWorkbench || !showChat;

  return (
@@ -16,7 +19,7 @@ export function HeaderActionButtons({}: HeaderActionButtonsProps) {
      <div className="flex border border-bolt-elements-borderColor rounded-md overflow-hidden">
        <Button
          active={showChat}
          disabled={!canHideChat || isSmallViewport} // expand button is disabled on mobile as it's needed
          onClick={() => {
            if (canHideChat) {
              chatStore.setKey('showChat', !showChat);


@@ -16,6 +16,7 @@ import { cubicEasingFn } from '~/utils/easings';
import { renderLogger } from '~/utils/logger';
import { EditorPanel } from './EditorPanel';
import { Preview } from './Preview';
import useViewport from '~/lib/hooks';

interface WorkspaceProps {
  chatStarted?: boolean;
@@ -65,6 +66,8 @@ export const Workbench = memo(({ chatStarted, isStreaming }: WorkspaceProps) =>
  const files = useStore(workbenchStore.files);
  const selectedView = useStore(workbenchStore.currentView);

  const isSmallViewport = useViewport(1024);

  const setSelectedView = (view: WorkbenchViewType) => {
    workbenchStore.currentView.set(view);
  };
@@ -128,18 +131,20 @@ export const Workbench = memo(({ chatStarted, isStreaming }: WorkspaceProps) =>
          className={classNames(
            'fixed top-[calc(var(--header-height)+1.5rem)] bottom-6 w-[var(--workbench-inner-width)] mr-4 z-0 transition-[left,width] duration-200 bolt-ease-cubic-bezier',
            {
              'w-full': isSmallViewport,
              'left-0': showWorkbench && isSmallViewport,
              'left-[var(--workbench-left)]': showWorkbench,
              'left-[100%]': !showWorkbench,
            },
          )}
        >
          <div className="absolute inset-0 px-2 lg:px-6">
            <div className="h-full flex flex-col bg-bolt-elements-background-depth-2 border border-bolt-elements-borderColor shadow-sm rounded-lg overflow-hidden">
              <div className="flex items-center px-3 py-2 border-b border-bolt-elements-borderColor">
                <Slider selected={selectedView} options={sliderOptions} setSelected={setSelectedView} />
                <div className="ml-auto" />
                {selectedView === 'code' && (
                  <div className="flex overflow-y-auto">
                    <PanelHeaderButton
                      className="mr-1 text-sm"
                      onClick={() => {
@@ -165,19 +170,22 @@ export const Workbench = memo(({ chatStarted, isStreaming }: WorkspaceProps) =>
                    <PanelHeaderButton
                      className="mr-1 text-sm"
                      onClick={() => {
                        const repoName = prompt(
                          'Please enter a name for your new GitHub repository:',
                          'bolt-generated-project',
                        );

                        if (!repoName) {
                          alert('Repository name is required. Push to GitHub cancelled.');
                          return;
                        }

                        const githubUsername = prompt('Please enter your GitHub username:');

                        if (!githubUsername) {
                          alert('GitHub username is required. Push to GitHub cancelled.');
                          return;
                        }

                        const githubToken = prompt('Please enter your GitHub personal access token:');

                        if (!githubToken) {
                          alert('GitHub token is required. Push to GitHub cancelled.');
                          return;
                        }
@@ -187,7 +195,7 @@ export const Workbench = memo(({ chatStarted, isStreaming }: WorkspaceProps) =>
                      <div className="i-ph:github-logo" />
                      Push to GitHub
                    </PanelHeaderButton>
                  </div>
                )}
                <IconButton
                  icon="i-ph:x-circle"


@@ -35,6 +35,8 @@ export function getAPIKey(cloudflareEnv: Env, provider: string, userApiKeys?: Re
      return env.OPENAI_LIKE_API_KEY || cloudflareEnv.OPENAI_LIKE_API_KEY;
    case "xAI":
      return env.XAI_API_KEY || cloudflareEnv.XAI_API_KEY;
    case "Cohere":
      return env.COHERE_API_KEY;
    default:
      return "";
  }


@@ -7,6 +7,11 @@ import { createGoogleGenerativeAI } from '@ai-sdk/google';
import { ollama } from 'ollama-ai-provider';
import { createOpenRouter } from "@openrouter/ai-sdk-provider";
import { createMistral } from '@ai-sdk/mistral';
import { createCohere } from '@ai-sdk/cohere';

export const DEFAULT_NUM_CTX = process.env.DEFAULT_NUM_CTX
  ? parseInt(process.env.DEFAULT_NUM_CTX, 10)
  : 32768;

export function getAnthropicModel(apiKey: string, model: string) {
  const anthropic = createAnthropic({
@@ -22,14 +27,16 @@ export function getOpenAILikeModel(baseURL: string, apiKey: string, model: strin
    baseURL,
    apiKey,
  });

  return openai(model);
}

export function getCohereAIModel(apiKey: string, model: string) {
  const cohere = createCohere({
    apiKey,
  });

  return cohere(model);
}

export function getOpenAIModel(apiKey: string, model: string) {
@@ -76,7 +83,7 @@ export function getHuggingFaceModel(apiKey: string, model: string) {
export function getOllamaModel(baseURL: string, model: string) {
  let Ollama = ollama(model, {
    numCtx: DEFAULT_NUM_CTX,
  });

  Ollama.config.baseURL = `${baseURL}/api`;
@@ -150,6 +157,8 @@ export function getModel(provider: string, model: string, env: Env, apiKeys?: Re
      return getLMStudioModel(baseURL, model);
    case 'xAI':
      return getXAIModel(apiKey, model);
    case 'Cohere':
      return getCohereAIModel(apiKey, model);
    default:
      return getOllamaModel(baseURL, model);
  }


@@ -58,7 +58,6 @@ function extractPropertiesFromMessage(message: Message): { model: string; provid
   return { model, provider, content: cleanedContent };
 }
-
 export function streamText(
   messages: Messages,
   env: Env,
@@ -68,8 +67,6 @@ export function streamText(
   let currentModel = DEFAULT_MODEL;
   let currentProvider = DEFAULT_PROVIDER;
-
-  // console.log('StreamText:', JSON.stringify(messages));
   const processedMessages = messages.map((message) => {
     if (message.role === 'user') {
       const { model, provider, content } = extractPropertiesFromMessage(message);
@@ -83,25 +80,19 @@ export function streamText(
       return { ...message, content };
     }
     return message; // No changes for non-user messages
   });

-  // console.log('Message content:', messages[0].content);
-  // console.log('Extracted properties:', extractPropertiesFromMessage(messages[0]));
-
-  const llmClient = getModel(currentProvider, currentModel, env, apiKeys);
-  // console.log('LLM Client:', llmClient);
-
-  const llmConfig = {
-    ...options,
-    model: llmClient, //getModel(currentProvider, currentModel, env, apiKeys),
-    provider: currentProvider,
+  const modelDetails = MODEL_LIST.find((m) => m.name === currentModel);
+
+  const dynamicMaxTokens =
+    modelDetails && modelDetails.maxTokenAllowed
+      ? modelDetails.maxTokenAllowed
+      : MAX_TOKENS;
+
+  return _streamText({
+    model: getModel(currentProvider, currentModel, env, apiKeys),
     system: getSystemPrompt(),
-    maxTokens: MAX_TOKENS,
+    maxTokens: dynamicMaxTokens,
     messages: convertToCoreMessages(processedMessages),
-  };
-
-  // console.log('LLM Config:', llmConfig);
-
-  return _streamText(llmConfig);
+    ...options,
+  });
 }
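
The reworked `streamText` above stops sending a flat `MAX_TOKENS` and instead looks up a per-model cap. The lookup can be sketched in isolation — the trimmed-down `ModelInfo` shape and the `8000` default are assumptions mirroring the rest of this diff:

```typescript
// Stand-in for the per-model maxTokens selection in streamText.
interface ModelInfo {
  name: string;
  maxTokenAllowed?: number; // optional here to show the fallback path
}

const MAX_TOKENS = 8000; // assumed project-wide default

function pickMaxTokens(modelList: ModelInfo[], currentModel: string): number {
  const details = modelList.find((m) => m.name === currentModel);
  return details && details.maxTokenAllowed ? details.maxTokenAllowed : MAX_TOKENS;
}

const list: ModelInfo[] = [
  { name: 'command-r', maxTokenAllowed: 4096 },
  { name: 'mystery-model' }, // no limit recorded
];

console.log(pickMaxTokens(list, 'command-r')); // 4096
console.log(pickMaxTokens(list, 'mystery-model')); // 8000
console.log(pickMaxTokens(list, 'unknown')); // 8000
```

Unknown or unlisted models fall through to the default, so a missing `maxTokenAllowed` entry degrades gracefully rather than failing the request.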

View File

@@ -2,3 +2,4 @@ export * from './useMessageParser';
 export * from './usePromptEnhancer';
 export * from './useShortcuts';
 export * from './useSnapScroll';
+export { default } from './useViewport';

View File

@@ -1,4 +1,5 @@
 import { useState } from 'react';
+import type { ProviderInfo } from '~/types/model';
 import { createScopedLogger } from '~/utils/logger';

 const logger = createScopedLogger('usePromptEnhancement');
@@ -16,8 +17,8 @@ export function usePromptEnhancer() {
     input: string,
     setInput: (value: string) => void,
     model: string,
-    provider: string,
-    apiKeys?: Record<string, string>
+    provider: ProviderInfo,
+    apiKeys?: Record<string, string>,
   ) => {
     setEnhancingPrompt(true);
     setPromptEnhanced(false);

View File

@@ -0,0 +1,18 @@
+import { useState, useEffect } from 'react';
+
+const useViewport = (threshold = 1024) => {
+  const [isSmallViewport, setIsSmallViewport] = useState(window.innerWidth < threshold);
+
+  useEffect(() => {
+    const handleResize = () => setIsSmallViewport(window.innerWidth < threshold);
+    window.addEventListener('resize', handleResize);
+
+    return () => {
+      window.removeEventListener('resize', handleResize);
+    };
+  }, [threshold]);
+
+  return isSmallViewport;
+};
+
+export default useViewport;
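
The new `useViewport` hook boils down to one comparison plus resize-event plumbing. The core predicate, extracted as a pure function for illustration (the real hook wires it to `window.innerWidth` and re-runs it on every `resize` event):

```typescript
// Pure form of the useViewport check: is the given width below the threshold?
// The 1024px default matches the hook's signature above.
function isSmallViewport(width: number, threshold = 1024): boolean {
  return width < threshold;
}

console.log(isSmallViewport(768)); // true
console.log(isSmallViewport(1024)); // false (strictly less-than, so 1024 is "large")
console.log(isSmallViewport(1280, 1440)); // true with a custom threshold
```

In a component, usage would look like `const isSmall = useViewport(1024);`, with the `useEffect` cleanup above preventing listener leaks on unmount.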

View File

@@ -2,7 +2,7 @@ import { type ActionFunctionArgs } from '@remix-run/cloudflare';
 import { StreamingTextResponse, parseStreamPart } from 'ai';
 import { streamText } from '~/lib/.server/llm/stream-text';
 import { stripIndents } from '~/utils/stripIndent';
-import type { StreamingOptions } from '~/lib/.server/llm/stream-text';
+import type { ProviderInfo } from '~/types/model';

 const encoder = new TextEncoder();
 const decoder = new TextDecoder();
@@ -15,22 +15,24 @@ async function enhancerAction({ context, request }: ActionFunctionArgs) {
   const { message, model, provider, apiKeys } = await request.json<{
     message: string;
     model: string;
-    provider: string;
+    provider: ProviderInfo;
     apiKeys?: Record<string, string>;
   }>();

-  // Validate 'model' and 'provider' fields
+  const { name: providerName } = provider;
+
+  // validate 'model' and 'provider' fields
   if (!model || typeof model !== 'string') {
     throw new Response('Invalid or missing model', {
       status: 400,
-      statusText: 'Bad Request'
+      statusText: 'Bad Request',
     });
   }

-  if (!provider || typeof provider !== 'string') {
+  if (!providerName || typeof providerName !== 'string') {
     throw new Response('Invalid or missing provider', {
       status: 400,
-      statusText: 'Bad Request'
+      statusText: 'Bad Request',
     });
   }
@@ -39,7 +41,9 @@ async function enhancerAction({ context, request }: ActionFunctionArgs) {
     [
       {
         role: 'user',
-        content: `[Model: ${model}]\n\n[Provider: ${provider}]\n\n` + stripIndents`
+        content:
+          `[Model: ${model}]\n\n[Provider: ${providerName}]\n\n` +
+          stripIndents`
           I want you to improve the user prompt that is wrapped in \`<original_prompt>\` tags.

           IMPORTANT: Only respond with the improved prompt and nothing else!
@@ -52,23 +56,24 @@ async function enhancerAction({ context, request }: ActionFunctionArgs) {
     ],
     context.cloudflare.env,
     undefined,
-    apiKeys
+    apiKeys,
   );

   const transformStream = new TransformStream({
     transform(chunk, controller) {
       const text = decoder.decode(chunk);
-      const lines = text.split('\n').filter(line => line.trim() !== '');
+      const lines = text.split('\n').filter((line) => line.trim() !== '');

       for (const line of lines) {
         try {
           const parsed = parseStreamPart(line);

           if (parsed.type === 'text') {
             controller.enqueue(encoder.encode(parsed.value));
           }
         } catch (e) {
-          // Skip invalid JSON lines
-          console.warn('Failed to parse stream part:', line);
+          // skip invalid JSON lines
+          console.warn('Failed to parse stream part:', line, e);
         }
       }
     },
@@ -83,7 +88,7 @@ async function enhancerAction({ context, request }: ActionFunctionArgs) {
   if (error instanceof Error && error.message?.includes('API key')) {
     throw new Response('Invalid or missing API key', {
       status: 401,
-      statusText: 'Unauthorized'
+      statusText: 'Unauthorized',
     });
   }
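
The enhancer route's transform stream keeps only the text parts of the AI SDK data stream and drops everything else. A simplified, self-contained version of that filtering logic — `parseStreamPart` belongs to the `ai` package, so it is replaced here by a hand-rolled parser; the `<code>:<json>` line shape and the `0:` prefix for text parts are assumptions about the stream protocol, not taken from this diff:

```typescript
// Simplified stand-in for the route's transform: split a chunk into lines,
// parse each `<code>:<json>` part, and keep only text parts (code "0").
function extractTextParts(chunkText: string): string[] {
  const out: string[] = [];
  for (const line of chunkText.split('\n').filter((l) => l.trim() !== '')) {
    const sep = line.indexOf(':');
    if (sep === -1) continue; // not a stream part at all
    const code = line.slice(0, sep);
    const payload = line.slice(sep + 1);
    try {
      const value = JSON.parse(payload);
      if (code === '0' && typeof value === 'string') out.push(value);
    } catch {
      // skip invalid JSON lines, as the route does
    }
  }
  return out;
}

console.log(extractTextParts('0:"Hello"\n2:{"foo":1}\n0:" world"')); // [ 'Hello', ' world' ]
```

Swallowing parse errors per line, as the route does, keeps one malformed part from killing the whole enhanced-prompt stream.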

View File

@@ -12,12 +12,12 @@ const PROVIDER_LIST: ProviderInfo[] = [
   {
     name: 'Anthropic',
     staticModels: [
-      { name: 'claude-3-5-sonnet-latest', label: 'Claude 3.5 Sonnet (new)', provider: 'Anthropic' },
-      { name: 'claude-3-5-sonnet-20240620', label: 'Claude 3.5 Sonnet (old)', provider: 'Anthropic' },
-      { name: 'claude-3-5-haiku-latest', label: 'Claude 3.5 Haiku (new)', provider: 'Anthropic' },
-      { name: 'claude-3-opus-latest', label: 'Claude 3 Opus', provider: 'Anthropic' },
-      { name: 'claude-3-sonnet-20240229', label: 'Claude 3 Sonnet', provider: 'Anthropic' },
-      { name: 'claude-3-haiku-20240307', label: 'Claude 3 Haiku', provider: 'Anthropic' }
+      { name: 'claude-3-5-sonnet-latest', label: 'Claude 3.5 Sonnet (new)', provider: 'Anthropic', maxTokenAllowed: 8000 },
+      { name: 'claude-3-5-sonnet-20240620', label: 'Claude 3.5 Sonnet (old)', provider: 'Anthropic', maxTokenAllowed: 8000 },
+      { name: 'claude-3-5-haiku-latest', label: 'Claude 3.5 Haiku (new)', provider: 'Anthropic', maxTokenAllowed: 8000 },
+      { name: 'claude-3-opus-latest', label: 'Claude 3 Opus', provider: 'Anthropic', maxTokenAllowed: 8000 },
+      { name: 'claude-3-sonnet-20240229', label: 'Claude 3 Sonnet', provider: 'Anthropic', maxTokenAllowed: 8000 },
+      { name: 'claude-3-haiku-20240307', label: 'Claude 3 Haiku', provider: 'Anthropic', maxTokenAllowed: 8000 }
     ],
     getApiKeyLink: "https://console.anthropic.com/settings/keys",
   },
@@ -36,23 +36,40 @@ const PROVIDER_LIST: ProviderInfo[] = [
     ],
     getDynamicModels: getOpenAILikeModels
   },
+  {
+    name: 'Cohere',
+    staticModels: [
+      { name: 'command-r-plus-08-2024', label: 'Command R plus Latest', provider: 'Cohere', maxTokenAllowed: 4096 },
+      { name: 'command-r-08-2024', label: 'Command R Latest', provider: 'Cohere', maxTokenAllowed: 4096 },
+      { name: 'command-r-plus', label: 'Command R plus', provider: 'Cohere', maxTokenAllowed: 4096 },
+      { name: 'command-r', label: 'Command R', provider: 'Cohere', maxTokenAllowed: 4096 },
+      { name: 'command', label: 'Command', provider: 'Cohere', maxTokenAllowed: 4096 },
+      { name: 'command-nightly', label: 'Command Nightly', provider: 'Cohere', maxTokenAllowed: 4096 },
+      { name: 'command-light', label: 'Command Light', provider: 'Cohere', maxTokenAllowed: 4096 },
+      { name: 'command-light-nightly', label: 'Command Light Nightly', provider: 'Cohere', maxTokenAllowed: 4096 },
+      { name: 'c4ai-aya-expanse-8b', label: 'c4AI Aya Expanse 8b', provider: 'Cohere', maxTokenAllowed: 4096 },
+      { name: 'c4ai-aya-expanse-32b', label: 'c4AI Aya Expanse 32b', provider: 'Cohere', maxTokenAllowed: 4096 },
+    ],
+    getApiKeyLink: 'https://dashboard.cohere.com/api-keys'
+  },
   {
     name: 'OpenRouter',
     staticModels: [
-      { name: 'gpt-4o', label: 'GPT-4o', provider: 'OpenRouter' },
+      { name: 'gpt-4o', label: 'GPT-4o', provider: 'OpenAI', maxTokenAllowed: 8000 },
       {
         name: 'anthropic/claude-3.5-sonnet',
         label: 'Anthropic: Claude 3.5 Sonnet (OpenRouter)',
         provider: 'OpenRouter'
+        , maxTokenAllowed: 8000
       },
-      { name: 'anthropic/claude-3-haiku', label: 'Anthropic: Claude 3 Haiku (OpenRouter)', provider: 'OpenRouter' },
-      { name: 'deepseek/deepseek-coder', label: 'Deepseek-Coder V2 236B (OpenRouter)', provider: 'OpenRouter' },
-      { name: 'google/gemini-flash-1.5', label: 'Google Gemini Flash 1.5 (OpenRouter)', provider: 'OpenRouter' },
-      { name: 'google/gemini-pro-1.5', label: 'Google Gemini Pro 1.5 (OpenRouter)', provider: 'OpenRouter' },
-      { name: 'x-ai/grok-beta', label: 'xAI Grok Beta (OpenRouter)', provider: 'OpenRouter' },
-      { name: 'mistralai/mistral-nemo', label: 'OpenRouter Mistral Nemo (OpenRouter)', provider: 'OpenRouter' },
-      { name: 'qwen/qwen-110b-chat', label: 'OpenRouter Qwen 110b Chat (OpenRouter)', provider: 'OpenRouter' },
-      { name: 'cohere/command', label: 'Cohere Command (OpenRouter)', provider: 'OpenRouter' }
+      { name: 'anthropic/claude-3-haiku', label: 'Anthropic: Claude 3 Haiku (OpenRouter)', provider: 'OpenRouter', maxTokenAllowed: 8000 },
+      { name: 'deepseek/deepseek-coder', label: 'Deepseek-Coder V2 236B (OpenRouter)', provider: 'OpenRouter', maxTokenAllowed: 8000 },
+      { name: 'google/gemini-flash-1.5', label: 'Google Gemini Flash 1.5 (OpenRouter)', provider: 'OpenRouter', maxTokenAllowed: 8000 },
+      { name: 'google/gemini-pro-1.5', label: 'Google Gemini Pro 1.5 (OpenRouter)', provider: 'OpenRouter', maxTokenAllowed: 8000 },
+      { name: 'x-ai/grok-beta', label: 'xAI Grok Beta (OpenRouter)', provider: 'OpenRouter', maxTokenAllowed: 8000 },
+      { name: 'mistralai/mistral-nemo', label: 'OpenRouter Mistral Nemo (OpenRouter)', provider: 'OpenRouter', maxTokenAllowed: 8000 },
+      { name: 'qwen/qwen-110b-chat', label: 'OpenRouter Qwen 110b Chat (OpenRouter)', provider: 'OpenRouter', maxTokenAllowed: 8000 },
+      { name: 'cohere/command', label: 'Cohere Command (OpenRouter)', provider: 'OpenRouter', maxTokenAllowed: 4096 }
     ],
     getDynamicModels: getOpenRouterModels,
     getApiKeyLink: 'https://openrouter.ai/settings/keys',
@@ -69,21 +86,21 @@ const PROVIDER_LIST: ProviderInfo[] = [
   }, {
     name: 'Groq',
     staticModels: [
-      { name: 'llama-3.1-70b-versatile', label: 'Llama 3.1 70b (Groq)', provider: 'Groq' },
-      { name: 'llama-3.1-8b-instant', label: 'Llama 3.1 8b (Groq)', provider: 'Groq' },
-      { name: 'llama-3.2-11b-vision-preview', label: 'Llama 3.2 11b (Groq)', provider: 'Groq' },
-      { name: 'llama-3.2-3b-preview', label: 'Llama 3.2 3b (Groq)', provider: 'Groq' },
-      { name: 'llama-3.2-1b-preview', label: 'Llama 3.2 1b (Groq)', provider: 'Groq' }
+      { name: 'llama-3.1-70b-versatile', label: 'Llama 3.1 70b (Groq)', provider: 'Groq', maxTokenAllowed: 8000 },
+      { name: 'llama-3.1-8b-instant', label: 'Llama 3.1 8b (Groq)', provider: 'Groq', maxTokenAllowed: 8000 },
+      { name: 'llama-3.2-11b-vision-preview', label: 'Llama 3.2 11b (Groq)', provider: 'Groq', maxTokenAllowed: 8000 },
+      { name: 'llama-3.2-3b-preview', label: 'Llama 3.2 3b (Groq)', provider: 'Groq', maxTokenAllowed: 8000 },
+      { name: 'llama-3.2-1b-preview', label: 'Llama 3.2 1b (Groq)', provider: 'Groq', maxTokenAllowed: 8000 }
     ],
     getApiKeyLink: 'https://console.groq.com/keys'
   },
   {
     name: 'HuggingFace',
     staticModels: [
-      { name: 'Qwen/Qwen2.5-Coder-32B-Instruct', label: 'Qwen2.5-Coder-32B-Instruct (HuggingFace)', provider: 'HuggingFace' },
-      { name: '01-ai/Yi-1.5-34B-Chat', label: 'Yi-1.5-34B-Chat (HuggingFace)', provider: 'HuggingFace' },
-      { name: 'codellama/CodeLlama-34b-Instruct-hf', label: 'CodeLlama-34b-Instruct (HuggingFace)', provider: 'HuggingFace' },
-      { name: 'NousResearch/Hermes-3-Llama-3.1-8B', label: 'Hermes-3-Llama-3.1-8B (HuggingFace)', provider: 'HuggingFace' }
+      { name: 'Qwen/Qwen2.5-Coder-32B-Instruct', label: 'Qwen2.5-Coder-32B-Instruct (HuggingFace)', provider: 'HuggingFace', maxTokenAllowed: 8000 },
+      { name: '01-ai/Yi-1.5-34B-Chat', label: 'Yi-1.5-34B-Chat (HuggingFace)', provider: 'HuggingFace', maxTokenAllowed: 8000 },
+      { name: 'codellama/CodeLlama-34b-Instruct-hf', label: 'CodeLlama-34b-Instruct (HuggingFace)', provider: 'HuggingFace', maxTokenAllowed: 8000 },
+      { name: 'NousResearch/Hermes-3-Llama-3.1-8B', label: 'Hermes-3-Llama-3.1-8B (HuggingFace)', provider: 'HuggingFace', maxTokenAllowed: 8000 }
     ],
     getApiKeyLink: 'https://huggingface.co/settings/tokens'
   },
@@ -91,37 +108,37 @@ const PROVIDER_LIST: ProviderInfo[] = [
   {
     name: 'OpenAI',
     staticModels: [
-      { name: 'gpt-4o-mini', label: 'GPT-4o Mini', provider: 'OpenAI' },
-      { name: 'gpt-4-turbo', label: 'GPT-4 Turbo', provider: 'OpenAI' },
-      { name: 'gpt-4', label: 'GPT-4', provider: 'OpenAI' },
-      { name: 'gpt-3.5-turbo', label: 'GPT-3.5 Turbo', provider: 'OpenAI' }
+      { name: 'gpt-4o-mini', label: 'GPT-4o Mini', provider: 'OpenAI', maxTokenAllowed: 8000 },
+      { name: 'gpt-4-turbo', label: 'GPT-4 Turbo', provider: 'OpenAI', maxTokenAllowed: 8000 },
+      { name: 'gpt-4', label: 'GPT-4', provider: 'OpenAI', maxTokenAllowed: 8000 },
+      { name: 'gpt-3.5-turbo', label: 'GPT-3.5 Turbo', provider: 'OpenAI', maxTokenAllowed: 8000 }
     ],
     getApiKeyLink: "https://platform.openai.com/api-keys",
   }, {
     name: 'xAI',
     staticModels: [
-      { name: 'grok-beta', label: 'xAI Grok Beta', provider: 'xAI' }
+      { name: 'grok-beta', label: 'xAI Grok Beta', provider: 'xAI', maxTokenAllowed: 8000 }
     ],
     getApiKeyLink: 'https://docs.x.ai/docs/quickstart#creating-an-api-key'
   }, {
     name: 'Deepseek',
     staticModels: [
-      { name: 'deepseek-coder', label: 'Deepseek-Coder', provider: 'Deepseek' },
-      { name: 'deepseek-chat', label: 'Deepseek-Chat', provider: 'Deepseek' }
+      { name: 'deepseek-coder', label: 'Deepseek-Coder', provider: 'Deepseek', maxTokenAllowed: 8000 },
+      { name: 'deepseek-chat', label: 'Deepseek-Chat', provider: 'Deepseek', maxTokenAllowed: 8000 }
    ],
     getApiKeyLink: 'https://platform.deepseek.com/api_keys'
   }, {
     name: 'Mistral',
     staticModels: [
-      { name: 'open-mistral-7b', label: 'Mistral 7B', provider: 'Mistral' },
-      { name: 'open-mixtral-8x7b', label: 'Mistral 8x7B', provider: 'Mistral' },
-      { name: 'open-mixtral-8x22b', label: 'Mistral 8x22B', provider: 'Mistral' },
-      { name: 'open-codestral-mamba', label: 'Codestral Mamba', provider: 'Mistral' },
-      { name: 'open-mistral-nemo', label: 'Mistral Nemo', provider: 'Mistral' },
-      { name: 'ministral-8b-latest', label: 'Mistral 8B', provider: 'Mistral' },
-      { name: 'mistral-small-latest', label: 'Mistral Small', provider: 'Mistral' },
-      { name: 'codestral-latest', label: 'Codestral', provider: 'Mistral' },
-      { name: 'mistral-large-latest', label: 'Mistral Large Latest', provider: 'Mistral' }
+      { name: 'open-mistral-7b', label: 'Mistral 7B', provider: 'Mistral', maxTokenAllowed: 8000 },
+      { name: 'open-mixtral-8x7b', label: 'Mistral 8x7B', provider: 'Mistral', maxTokenAllowed: 8000 },
+      { name: 'open-mixtral-8x22b', label: 'Mistral 8x22B', provider: 'Mistral', maxTokenAllowed: 8000 },
+      { name: 'open-codestral-mamba', label: 'Codestral Mamba', provider: 'Mistral', maxTokenAllowed: 8000 },
+      { name: 'open-mistral-nemo', label: 'Mistral Nemo', provider: 'Mistral', maxTokenAllowed: 8000 },
+      { name: 'ministral-8b-latest', label: 'Mistral 8B', provider: 'Mistral', maxTokenAllowed: 8000 },
+      { name: 'mistral-small-latest', label: 'Mistral Small', provider: 'Mistral', maxTokenAllowed: 8000 },
+      { name: 'codestral-latest', label: 'Codestral', provider: 'Mistral', maxTokenAllowed: 8000 },
+      { name: 'mistral-large-latest', label: 'Mistral Large Latest', provider: 'Mistral', maxTokenAllowed: 8000 }
     ],
     getApiKeyLink: 'https://console.mistral.ai/api-keys/'
   }, {
@@ -165,7 +182,8 @@ async function getOllamaModels(): Promise<ModelInfo[]> {
     return data.models.map((model: OllamaModel) => ({
       name: model.name,
       label: `${model.name} (${model.details.parameter_size})`,
-      provider: 'Ollama'
+      provider: 'Ollama',
+      maxTokenAllowed:8000,
     }));
   } catch (e) {
     return [];
@@ -219,7 +237,8 @@ async function getOpenRouterModels(): Promise<ModelInfo[]> {
     label: `${m.name} - in:$${(m.pricing.prompt * 1_000_000).toFixed(
       2)} out:$${(m.pricing.completion * 1_000_000).toFixed(2)} - context ${Math.floor(
       m.context_length / 1000)}k`,
-    provider: 'OpenRouter'
+    provider: 'OpenRouter',
+    maxTokenAllowed:8000,
   }));
 }

View File

@@ -25,6 +25,7 @@ export interface ModelInfo {
   name: string;
   label: string;
   provider: string;
+  maxTokenAllowed: number;
 }

 export interface ProviderInfo {
export interface ProviderInfo { export interface ProviderInfo {

View File

@@ -21,6 +21,7 @@ services:
       - GOOGLE_GENERATIVE_AI_API_KEY=${GOOGLE_GENERATIVE_AI_API_KEY}
       - OLLAMA_API_BASE_URL=${OLLAMA_API_BASE_URL}
      - VITE_LOG_LEVEL=${VITE_LOG_LEVEL:-debug}
+      - DEFAULT_NUM_CTX=${DEFAULT_NUM_CTX:-32768}
       - RUNNING_IN_DOCKER=true
     extra_hosts:
       - "host.docker.internal:host-gateway"
@@ -48,6 +49,7 @@ services:
       - GOOGLE_GENERATIVE_AI_API_KEY=${GOOGLE_GENERATIVE_AI_API_KEY}
       - OLLAMA_API_BASE_URL=${OLLAMA_API_BASE_URL}
       - VITE_LOG_LEVEL=${VITE_LOG_LEVEL:-debug}
+      - DEFAULT_NUM_CTX=${DEFAULT_NUM_CTX:-32768}
       - RUNNING_IN_DOCKER=true
     extra_hosts:
       - "host.docker.internal:host-gateway"

View File

@@ -27,6 +27,7 @@
   },
   "dependencies": {
     "@ai-sdk/anthropic": "^0.0.39",
+    "@ai-sdk/cohere": "^1.0.1",
     "@ai-sdk/google": "^0.0.52",
     "@ai-sdk/mistral": "^0.0.43",
     "@ai-sdk/openai": "^0.0.66",

pnpm-lock.yaml generated
View File

@@ -14,6 +14,9 @@ importers:
       '@ai-sdk/anthropic':
         specifier: ^0.0.39
         version: 0.0.39(zod@3.23.8)
+      '@ai-sdk/cohere':
+        specifier: ^1.0.1
+        version: 1.0.1(zod@3.23.8)
       '@ai-sdk/google':
         specifier: ^0.0.52
         version: 0.0.52(zod@3.23.8)
@@ -279,6 +282,12 @@ packages:
     peerDependencies:
       zod: ^3.0.0

+  '@ai-sdk/cohere@1.0.1':
+    resolution: {integrity: sha512-xLaSYl/hs9EqfpvT9PvqZrDWjJPQPZBd0iT32T6812vN6kwuEQ6sSgQvqHWczIqxeej2GNRgMQwDL6Lh0L5pZw==}
+    engines: {node: '>=18'}
+    peerDependencies:
+      zod: ^3.0.0
+
   '@ai-sdk/google@0.0.52':
     resolution: {integrity: sha512-bfsA/1Ae0SQ6NfLwWKs5SU4MBwlzJjVhK6bTVBicYFjUxg9liK/W76P1Tq/qK9OlrODACz3i1STOIWsFPpIOuQ==}
     engines: {node: '>=18'}
@@ -324,6 +333,15 @@ packages:
       zod:
         optional: true

+  '@ai-sdk/provider-utils@2.0.1':
+    resolution: {integrity: sha512-TNg7rPhRtETB2Z9F0JpOvpGii9Fs8EWM8nYy1jEkvSXkrPJ6b/9zVnDdaJsmLFDyrMbOsPJlkblYtmYEQou36w==}
+    engines: {node: '>=18'}
+    peerDependencies:
+      zod: ^3.0.0
+    peerDependenciesMeta:
+      zod:
+        optional: true
+
   '@ai-sdk/provider@0.0.12':
     resolution: {integrity: sha512-oOwPQD8i2Ynpn22cur4sk26FW3mSy6t6/X/K1Ay2yGBKYiSpRyLfObhOrZEGsXDx+3euKy4nEZ193R36NM+tpQ==}
     engines: {node: '>=18'}
@@ -336,6 +354,10 @@ packages:
     resolution: {integrity: sha512-XMsNGJdGO+L0cxhhegtqZ8+T6nn4EoShS819OvCgI2kLbYTIvk0GWFGD0AXJmxkxs3DrpsJxKAFukFR7bvTkgQ==}
     engines: {node: '>=18'}

+  '@ai-sdk/provider@1.0.0':
+    resolution: {integrity: sha512-Sj29AzooJ7SYvhPd+AAWt/E7j63E9+AzRnoMHUaJPRYzOd/WDrVNxxv85prF9gDcQ7XPVlSk9j6oAZV9/DXYpA==}
+    engines: {node: '>=18'}
+
   '@ai-sdk/react@0.0.62':
     resolution: {integrity: sha512-1asDpxgmeHWL0/EZPCLENxfOHT+0jce0z/zasRhascodm2S6f6/KZn5doLG9jdmarcb+GjMjFmmwyOVXz3W1xg==}
     engines: {node: '>=18'}
@@ -3033,6 +3055,10 @@ packages:
     resolution: {integrity: sha512-v0eOBUbiaFojBu2s2NPBfYUoRR9GjcDNvCXVaqEf5vVfpIAh9f8RCo4vXTP8c63QRKCFwoLpMpTdPwwhEKVgzA==}
     engines: {node: '>=14.18'}

+  eventsource-parser@3.0.0:
+    resolution: {integrity: sha512-T1C0XCUimhxVQzW4zFipdx0SficT651NnkR0ZSH3yQwh+mFMdLfgjABVi4YtMTtaL4s168593DaoaRLMqryavA==}
+    engines: {node: '>=18.0.0'}
+
   evp_bytestokey@1.0.3:
     resolution: {integrity: sha512-/f2Go4TognH/KvCISP7OUsHn85hT9nUkxxA9BEWxFn+Oj9o8ZNLm/40hdlgSLyuOimsrTKLUMEorQexp/aPQeA==}
@@ -5687,6 +5713,12 @@ snapshots:
     '@ai-sdk/provider-utils': 1.0.9(zod@3.23.8)
     zod: 3.23.8

+  '@ai-sdk/cohere@1.0.1(zod@3.23.8)':
+    dependencies:
+      '@ai-sdk/provider': 1.0.0
+      '@ai-sdk/provider-utils': 2.0.1(zod@3.23.8)
+      zod: 3.23.8
+
   '@ai-sdk/google@0.0.52(zod@3.23.8)':
     dependencies:
       '@ai-sdk/provider': 0.0.24
@@ -5733,6 +5765,15 @@ snapshots:
     optionalDependencies:
       zod: 3.23.8

+  '@ai-sdk/provider-utils@2.0.1(zod@3.23.8)':
+    dependencies:
+      '@ai-sdk/provider': 1.0.0
+      eventsource-parser: 3.0.0
+      nanoid: 3.3.7
+      secure-json-parse: 2.7.0
+    optionalDependencies:
+      zod: 3.23.8
+
   '@ai-sdk/provider@0.0.12':
     dependencies:
       json-schema: 0.4.0
@@ -5745,6 +5786,10 @@ snapshots:
     dependencies:
       json-schema: 0.4.0

+  '@ai-sdk/provider@1.0.0':
+    dependencies:
+      json-schema: 0.4.0
+
   '@ai-sdk/react@0.0.62(react@18.3.1)(zod@3.23.8)':
     dependencies:
       '@ai-sdk/provider-utils': 1.0.20(zod@3.23.8)
@@ -8751,6 +8796,8 @@ snapshots:
   eventsource-parser@1.1.2: {}

+  eventsource-parser@3.0.0: {}
+
   evp_bytestokey@1.0.3:
     dependencies:
       md5.js: 1.3.5