feat: increase icon size and fix linting

This commit is contained in:
Harivansh Rathi 2024-12-14 10:33:28 -05:00
parent eef7a6557f
commit 62b995490e
21 changed files with 166 additions and 139 deletions

View file

@ -1,4 +1,4 @@
name: "Bug report"
name: 'Bug report'
description: Create a report to help us improve
body:
- type: markdown

View file

@ -14,12 +14,12 @@ jobs:
uses: actions/stale@v8
with:
repo-token: ${{ secrets.GITHUB_TOKEN }}
stale-issue-message: "This issue has been marked as stale due to inactivity. If no further activity occurs, it will be closed in 7 days."
stale-pr-message: "This pull request has been marked as stale due to inactivity. If no further activity occurs, it will be closed in 7 days."
stale-issue-message: 'This issue has been marked as stale due to inactivity. If no further activity occurs, it will be closed in 7 days.'
stale-pr-message: 'This pull request has been marked as stale due to inactivity. If no further activity occurs, it will be closed in 7 days.'
days-before-stale: 10 # Number of days before marking an issue or PR as stale
days-before-close: 4 # Number of days after being marked stale before closing
stale-issue-label: "stale" # Label to apply to stale issues
stale-pr-label: "stale" # Label to apply to stale pull requests
exempt-issue-labels: "pinned,important" # Issues with these labels won't be marked stale
exempt-pr-labels: "pinned,important" # PRs with these labels won't be marked stale
stale-issue-label: 'stale' # Label to apply to stale issues
stale-pr-label: 'stale' # Label to apply to stale pull requests
exempt-issue-labels: 'pinned,important' # Issues with these labels won't be marked stale
exempt-pr-labels: 'pinned,important' # PRs with these labels won't be marked stale
operations-per-run: 75 # Limits the number of actions per run to avoid API rate limits

View file

@ -3,6 +3,7 @@
First off, thank you for considering contributing to Bolt.diy! This fork aims to expand the capabilities of the original project by integrating multiple LLM providers and enhancing functionality. Every contribution helps make Bolt.diy a better tool for developers worldwide.
## 📋 Table of Contents
- [Code of Conduct](#code-of-conduct)
- [How Can I Contribute?](#how-can-i-contribute)
- [Pull Request Guidelines](#pull-request-guidelines)
@ -18,29 +19,34 @@ This project and everyone participating in it is governed by our Code of Conduct
## How Can I Contribute?
### 🐞 Reporting Bugs and Feature Requests
- Check the issue tracker to avoid duplicates
- Use the issue templates when available
- Include as much relevant information as possible
- For bugs, add steps to reproduce the issue
### 🔧 Code Contributions
1. Fork the repository
2. Create a new branch for your feature/fix
3. Write your code
4. Submit a pull request
### ✨ Becoming a Core Contributor
We're looking for dedicated contributors to help maintain and grow this project. If you're interested in becoming a core contributor, please fill out our [Contributor Application Form](https://forms.gle/TBSteXSDCtBDwr5m7).
## Pull Request Guidelines
### 📝 PR Checklist
- [ ] Branch from the main branch
- [ ] Update documentation if needed
- [ ] Manually verify all new functionality works as expected
- [ ] Keep PRs focused and atomic
### 👀 Review Process
1. Manually test the changes
2. At least one maintainer review required
3. Address all review comments
@ -49,6 +55,7 @@ We're looking for dedicated contributors to help maintain and grow this project.
## Coding Standards
### 💻 General Guidelines
- Follow existing code style
- Comment complex logic
- Keep functions focused and small
@ -59,12 +66,15 @@ so set up your IDE to do that for you!
## Development Setup
### 🔄 Initial Setup
1. Clone the repository:
```bash
git clone https://github.com/coleam00/bolt.new-any-llm.git
```
2. Install dependencies:
```bash
pnpm install
```
@ -72,6 +82,7 @@ pnpm install
3. Set up environment variables:
- Rename `.env.example` to `.env.local`
- Add your LLM API keys (only set the ones you plan to use):
```bash
GROQ_API_KEY=XXX
HuggingFace_API_KEY=XXX
@ -79,26 +90,30 @@ OPENAI_API_KEY=XXX
ANTHROPIC_API_KEY=XXX
...
```
- Optionally set debug level:
```bash
VITE_LOG_LEVEL=debug
```
- Optionally set context size:
```bash
DEFAULT_NUM_CTX=32768
```
Some example context values for the qwen2.5-coder:32b model are:
* DEFAULT_NUM_CTX=32768 - Consumes 36GB of VRAM
* DEFAULT_NUM_CTX=24576 - Consumes 32GB of VRAM
* DEFAULT_NUM_CTX=12288 - Consumes 26GB of VRAM
* DEFAULT_NUM_CTX=6144 - Consumes 24GB of VRAM
- DEFAULT_NUM_CTX=32768 - Consumes 36GB of VRAM
- DEFAULT_NUM_CTX=24576 - Consumes 32GB of VRAM
- DEFAULT_NUM_CTX=12288 - Consumes 26GB of VRAM
- DEFAULT_NUM_CTX=6144 - Consumes 24GB of VRAM
**Important**: Never commit your `.env.local` file to version control. It's already included in `.gitignore`.
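Taken together, the environment steps above amount to writing a small `.env.local`. A minimal sketch (the key values shown are placeholders, not real credentials — set only the providers you plan to use):

```shell
# Hypothetical minimal .env.local for local development.
cat > .env.local <<'EOF'
GROQ_API_KEY=XXX
VITE_LOG_LEVEL=debug
DEFAULT_NUM_CTX=32768
EOF

# Sanity check: three KEY=value entries were written.
grep -c '=' .env.local
```

As noted above, keep this file out of version control.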
### 🚀 Running the Development Server
```bash
pnpm run dev
```
@ -205,6 +220,7 @@ The `docker-compose.yaml` configuration is compatible with VS Code dev container
## Environment Files
Ensure you have the appropriate `.env.local` file configured before running the containers. This file should contain:
- API keys
- Environment-specific configurations
- Other required environment variables

View file

@ -20,6 +20,7 @@ Welcome to asap.it, an AI-powered full-stack web development platform that lets
2. Clone the repository
3. Rename `.env.example` to `.env.local` and add your LLM API keys
4. Install dependencies:
```bash
pnpm install
```
@ -27,6 +28,7 @@ pnpm install
## Development
Run the development server:
```bash
pnpm run dev
```
@ -34,6 +36,7 @@ pnpm run dev
## Docker Support
Build and run with Docker:
```bash
# Development
npm run dockerbuild
@ -47,6 +50,7 @@ docker-compose --profile production up
## Environment Variables
Required (set only the ones you plan to use):
```
GROQ_API_KEY=XXX
OPENAI_API_KEY=XXX
@ -54,7 +58,9 @@ ANTHROPIC_API_KEY=XXX
```
Optional:
```
VITE_LOG_LEVEL=debug
OLLAMA_API_BASE_URL=http://localhost:11434
DEFAULT_NUM_CTX=8192
```

View file

@ -284,14 +284,12 @@ export const BaseChat = React.forwardRef<HTMLDivElement, BaseChatProps>(
<div className={classNames(styles.Chat, 'flex flex-col flex-grow lg:min-w-[var(--chat-min-width)] h-full')}>
{!chatStarted && (
<div id="intro" className="mt-[26vh] max-w-chat mx-auto text-center px-4 lg:px-0">
<div className="flex flex-col items-center justify-center">
<div className="w-16 h-16 mb-4">
<img src="/favicon.svg" alt="Asap.it Logo" className="w-full h-full" />
</div>
<h1 className="text-3xl lg:text-6xl font-bold text-bolt-elements-textPrimary mb-4 animate-fade-in">
Lets asap it
Let's asap it
</h1>
<p className="text-md lg:text-xl mb-8 text-bolt-elements-textSecondary animate-fade-in animation-delay-200">
Bring ideas to life in seconds or get help on existing projects.

View file

@ -112,11 +112,7 @@
--cm-searchMatch-backgroundColor: var(--bolt-elements-editor-searchMatch-backgroundColor, rgba(234, 92, 0, 0.33));
&[data-selected='true'] {
background: linear-gradient(
to right,
theme('colors.alpha.gray.30'),
transparent
);
background: linear-gradient(to right, theme('colors.alpha.gray.30'), transparent);
}
}

View file

@ -46,7 +46,6 @@
- search chats
- Connections Tabs
#### 🐛 Bug Fixes
- buttons after switching to tailwind-compat reset
@ -105,7 +104,6 @@
- re-capitalize "NEW"
- dev command
#### 📚 Documentation
- fix typo in CONTRIBUTING.md (#158)
@ -114,21 +112,18 @@
- add link to bolt.new issue tracker
- added socials
#### ♻️ Code Refactoring
- workbench store and move logic into action runner (#4)
- improve history item hover states and interactions
- settinge menu refactored with useSettings hook
#### ⚙️ CI
- use correct versions (#2)
- deploy to cloudflare (#19)
- remove deployment workflow
#### 🔧 Chores
- make sure that husky hooks are executed
@ -209,7 +204,6 @@
- adding workflow
- update commit hash to 6cb536a9a32e04b4ebc1f3788d6fae06c5bce5ac
#### 🔍 Other Changes
- add file tree and hook up editor
@ -804,5 +798,3 @@
- Merge pull request #697 from thecodacus/update-socials
- Merge pull request #701 from thecodacus/auto-versioning #release
- skipping commit version

View file

@ -6,8 +6,8 @@ services:
dockerfile: Dockerfile
target: bolt-ai-production
ports:
- "5173:5173"
env_file: ".env.local"
- '5173:5173'
env_file: '.env.local'
environment:
- NODE_ENV=production
- COMPOSE_PROFILES=production
@ -26,7 +26,7 @@ services:
- DEFAULT_NUM_CTX=${DEFAULT_NUM_CTX:-32768}
- RUNNING_IN_DOCKER=true
extra_hosts:
- "host.docker.internal:host-gateway"
- 'host.docker.internal:host-gateway'
command: pnpm run dockerstart
profiles:
- production
@ -56,7 +56,7 @@ services:
- DEFAULT_NUM_CTX=${DEFAULT_NUM_CTX:-32768}
- RUNNING_IN_DOCKER=true
extra_hosts:
- "host.docker.internal:host-gateway"
- 'host.docker.internal:host-gateway'
volumes:
- type: bind
source: .
@ -64,6 +64,6 @@ services:
consistency: cached
- /app/node_modules
ports:
- "5173:5173"
- '5173:5173'
command: pnpm run dev --host 0.0.0.0
profiles: ["development", "default"]
profiles: ['development', 'default']
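The `DEFAULT_NUM_CTX=${DEFAULT_NUM_CTX:-32768}` entries in the compose file above rely on standard `${VAR:-default}` parameter expansion (Docker Compose interpolates it the same way the shell does): a value from the environment wins, and `32768` is used only when the variable is unset. A quick illustration:

```shell
# With the variable unset, the fallback after ':-' is used.
unset DEFAULT_NUM_CTX
echo "ctx=${DEFAULT_NUM_CTX:-32768}"   # prints ctx=32768

# With the variable set, its value takes precedence over the fallback.
DEFAULT_NUM_CTX=8192
echo "ctx=${DEFAULT_NUM_CTX:-32768}"   # prints ctx=8192
```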

View file

@ -7,6 +7,7 @@ The `DEFAULT_NUM_CTX` environment variable can be used to limit the maximum numb
First off, thank you for considering contributing to Bolt.diy! This fork aims to expand the capabilities of the original project by integrating multiple LLM providers and enhancing functionality. Every contribution helps make Bolt.diy a better tool for developers worldwide.
## 📋 Table of Contents
- [Code of Conduct](#code-of-conduct)
- [How Can I Contribute?](#how-can-i-contribute)
- [Pull Request Guidelines](#pull-request-guidelines)
@ -21,29 +22,34 @@ This project and everyone participating in it is governed by our Code of Conduct
## How Can I Contribute?
### 🐞 Reporting Bugs and Feature Requests
- Check the issue tracker to avoid duplicates
- Use the issue templates when available
- Include as much relevant information as possible
- For bugs, add steps to reproduce the issue
### 🔧 Code Contributions
1. Fork the repository
2. Create a new branch for your feature/fix
3. Write your code
4. Submit a pull request
### ✨ Becoming a Core Contributor
We're looking for dedicated contributors to help maintain and grow this project. If you're interested in becoming a core contributor, please fill out our [Contributor Application Form](https://forms.gle/TBSteXSDCtBDwr5m7).
## Pull Request Guidelines
### 📝 PR Checklist
- [ ] Branch from the main branch
- [ ] Update documentation if needed
- [ ] Manually verify all new functionality works as expected
- [ ] Keep PRs focused and atomic
### 👀 Review Process
1. Manually test the changes
2. At least one maintainer review required
3. Address all review comments
@ -52,6 +58,7 @@ We're looking for dedicated contributors to help maintain and grow this project.
## Coding Standards
### 💻 General Guidelines
- Follow existing code style
- Comment complex logic
- Keep functions focused and small
@ -60,12 +67,15 @@ We're looking for dedicated contributors to help maintain and grow this project.
## Development Setup
### 🔄 Initial Setup
1. Clone the repository:
```bash
git clone https://github.com/stackblitz-labs/bolt.diy.git
```
2. Install dependencies:
```bash
pnpm install
```
@ -73,6 +83,7 @@ pnpm install
3. Set up environment variables:
- Rename `.env.example` to `.env.local`
- Add your LLM API keys (only set the ones you plan to use):
```bash
GROQ_API_KEY=XXX
HuggingFace_API_KEY=XXX
@ -80,26 +91,30 @@ OPENAI_API_KEY=XXX
ANTHROPIC_API_KEY=XXX
...
```
- Optionally set debug level:
```bash
VITE_LOG_LEVEL=debug
```
- Optionally set context size:
```bash
DEFAULT_NUM_CTX=32768
```
Some example context values for the qwen2.5-coder:32b model are:
* DEFAULT_NUM_CTX=32768 - Consumes 36GB of VRAM
* DEFAULT_NUM_CTX=24576 - Consumes 32GB of VRAM
* DEFAULT_NUM_CTX=12288 - Consumes 26GB of VRAM
* DEFAULT_NUM_CTX=6144 - Consumes 24GB of VRAM
- DEFAULT_NUM_CTX=32768 - Consumes 36GB of VRAM
- DEFAULT_NUM_CTX=24576 - Consumes 32GB of VRAM
- DEFAULT_NUM_CTX=12288 - Consumes 26GB of VRAM
- DEFAULT_NUM_CTX=6144 - Consumes 24GB of VRAM
**Important**: Never commit your `.env.local` file to version control. It's already included in `.gitignore`.
### 🚀 Running the Development Server
```bash
pnpm run dev
```
@ -206,6 +221,7 @@ The `docker-compose.yaml` configuration is compatible with VS Code dev container
## Environment Files
Ensure you have the appropriate `.env.local` file configured before running the containers. This file should contain:
- API keys
- Environment-specific configurations
- Other required environment variables

View file

@ -6,14 +6,14 @@
Mention the frameworks or libraries you want to use (e.g., Astro, Tailwind, ShadCN) in your initial prompt. This ensures that Bolt.diy scaffolds the project according to your preferences.
- **Use the enhance prompt icon**:
Before sending your prompt, click the *enhance* icon to let the AI refine your prompt. You can edit the suggested improvements before submitting.
Before sending your prompt, click the _enhance_ icon to let the AI refine your prompt. You can edit the suggested improvements before submitting.
- **Scaffold the basics first, then add features**:
Ensure the foundational structure of your application is in place before introducing advanced functionality. This helps Bolt.diy establish a solid base to build on.
- **Batch simple instructions**:
Combine simple tasks into a single prompt to save time and reduce API credit consumption. For example:
*"Change the color scheme, add mobile responsiveness, and restart the dev server."*
_"Change the color scheme, add mobile responsiveness, and restart the dev server."_
---
@ -23,7 +23,6 @@ Check out our [Contribution Guide](CONTRIBUTING.md) for more details on how to g
---
## What are the future plans for Bolt.diy?
Visit our [Roadmap](https://roadmap.sh/r/ottodev-roadmap-2ovzo) for the latest updates.
@ -48,27 +47,33 @@ While local LLMs are improving rapidly, larger models like GPT-4o, Claude 3.5 So
## Common Errors and Troubleshooting
### **"There was an error processing this request"**
This generic error message means something went wrong. Check both:
- The terminal (if you started the app with Docker or `pnpm`).
- The developer console in your browser (press `F12` or right-click > *Inspect*, then go to the *Console* tab).
- The developer console in your browser (press `F12` or right-click > _Inspect_, then go to the _Console_ tab).
---
### **"x-api-key header missing"**
This error is sometimes resolved by restarting the Docker container.
If that doesn't work, try switching from Docker to `pnpm` or vice versa. We're actively investigating this issue.
---
### **Blank preview when running the app**
A blank preview often occurs due to hallucinated bad code or incorrect commands.
To troubleshoot:
- Check the developer console for errors.
- Remember, previews are core functionality, so the app isn't broken! We're working on making these errors more transparent.
---
### **"Everything works, but the results are bad"**
Local LLMs like Qwen-2.5-Coder are powerful for small applications but still experimental for larger projects. For better results, consider using larger models like GPT-4o, Claude 3.5 Sonnet, or DeepSeek Coder V2 236b.
---

View file

@ -1,4 +1,5 @@
# Welcome to Bolt DIY
Bolt.diy allows you to choose the LLM that you use for each prompt! Currently, you can use OpenAI, Anthropic, Ollama, OpenRouter, Gemini, LMStudio, Mistral, xAI, HuggingFace, DeepSeek, or Groq models - and it is easily extended to use any other model supported by the Vercel AI SDK! See the instructions below for running this locally and extending it to include more models.
Join the community!
@ -14,6 +15,7 @@ Bolt.diy is an AI-powered web development agent that allows you to prompt, run,
Claude, v0, etc. are incredible, but you can't install packages, run backends, or edit code. That's where Bolt.diy stands out:
- **Full-Stack in the Browser**: Bolt.diy integrates cutting-edge AI models with an in-browser development environment powered by **StackBlitz's WebContainers**. This allows you to:
- Install and run npm tools and libraries (like Vite, Next.js, and more)
- Run Node.js servers
- Interact with third-party APIs

View file

@ -33,7 +33,7 @@ theme:
# favicon: assets/logo.png
repo_name: Bolt.diy
repo_url: https://github.com/stackblitz-labs/bolt.diy
edit_uri: ""
edit_uri: ''
extra:
generator: false
@ -51,9 +51,6 @@ extra:
link: https://bsky.app/profile/bolt.diy
name: Bolt.diy on Bluesky
markdown_extensions:
- pymdownx.highlight:
anchor_linenums: true

View file

@ -4,13 +4,7 @@ import { getNamingConventionRule, tsFileExtensions } from '@blitz/eslint-plugin/
export default [
{
ignores: [
'**/dist',
'**/node_modules',
'**/.wrangler',
'**/bolt/build',
'**/.history',
],
ignores: ['**/dist', '**/node_modules', '**/.wrangler', '**/bolt/build', '**/.history'],
},
...blitzPlugin.configs.recommended(),
{
@ -20,15 +14,15 @@ export default [
'@typescript-eslint/no-empty-object-type': 'off',
'@blitz/comment-syntax': 'off',
'@blitz/block-scope-case': 'off',
'array-bracket-spacing': ["error", "never"],
'object-curly-newline': ["error", { "consistent": true }],
'keyword-spacing': ["error", { "before": true, "after": true }],
'consistent-return': "error",
'semi': ["error", "always"],
'curly': ["error"],
'no-eval': ["error"],
'linebreak-style': ["error", "unix"],
'arrow-spacing': ["error", { "before": true, "after": true }]
'array-bracket-spacing': ['error', 'never'],
'object-curly-newline': ['error', { consistent: true }],
'keyword-spacing': ['error', { before: true, after: true }],
'consistent-return': 'error',
semi: ['error', 'always'],
curly: ['error'],
'no-eval': ['error'],
'linebreak-style': ['error', 'unix'],
'arrow-spacing': ['error', { before: true, after: true }],
},
},
{
@ -53,7 +47,7 @@ export default [
patterns: [
{
group: ['../'],
message: 'Relative imports are not allowed. Please use \'~/\' instead.',
message: "Relative imports are not allowed. Please use '~/' instead.",
},
],
},

View file

@ -1,7 +1,12 @@
{
"compilerOptions": {
"lib": ["DOM", "DOM.Iterable", "ESNext"],
"types": ["@remix-run/cloudflare", "vite/client", "@cloudflare/workers-types/2023-07-01", "@types/dom-speech-recognition"],
"types": [
"@remix-run/cloudflare",
"vite/client",
"@cloudflare/workers-types/2023-07-01",
"@types/dom-speech-recognition"
],
"isolatedModules": true,
"esModuleInterop": true,
"jsx": "react-jsx",

View file

@ -19,7 +19,7 @@ export default defineConfig((config) => {
future: {
v3_fetcherPersist: true,
v3_relativeSplatPath: true,
v3_throwAbortReason: true
v3_throwAbortReason: true,
},
}),
UnoCSS(),
@ -27,7 +27,7 @@ export default defineConfig((config) => {
chrome129IssuePlugin(),
config.mode === 'production' && optimizeCssModules({ apply: 'build' }),
],
envPrefix: ["VITE_", "OPENAI_LIKE_API_", "OLLAMA_API_BASE_URL", "LMSTUDIO_API_BASE_URL","TOGETHER_API_BASE_URL"],
envPrefix: ['VITE_', 'OPENAI_LIKE_API_', 'OLLAMA_API_BASE_URL', 'LMSTUDIO_API_BASE_URL', 'TOGETHER_API_BASE_URL'],
css: {
preprocessorOptions: {
scss: {
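The `envPrefix` entry in the Vite config above controls which environment variables Vite exposes to client-side code via `import.meta.env`: only names matching one of the listed prefixes get through, and everything else stays server-only. A rough shell illustration of that filtering (the variable names here are just examples):

```shell
# Mimics Vite's envPrefix filtering: names matching a listed prefix are
# exposed to the client bundle; everything else stays server-only.
for name in VITE_LOG_LEVEL OLLAMA_API_BASE_URL ANTHROPIC_API_KEY; do
  case "$name" in
    VITE_*|OPENAI_LIKE_API_*|OLLAMA_API_BASE_URL*|LMSTUDIO_API_BASE_URL*|TOGETHER_API_BASE_URL*)
      echo "exposed: $name" ;;
    *)
      echo "server-only: $name" ;;
  esac
done
```

This is what keeps provider API keys such as `ANTHROPIC_API_KEY` out of the shipped client code while still exposing the base URLs the browser needs.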