# Introduction

> **194 practical guides** for deploying AI models, GPU workloads, and AI platforms on [Clore.ai](https://clore.ai) — the decentralized GPU rental marketplace.

{% hint style="success" %}
All examples can be run on GPU servers rented through the [Clore.ai Marketplace](https://clore.ai/marketplace). Rent powerful GPUs starting from **$0.15/day**.
{% endhint %}

## What is Clore.ai?

[Clore.ai](https://clore.ai) is a peer-to-peer GPU marketplace where you rent GPUs directly from other people — like Airbnb for compute. Thousands of GPUs are available 24/7, from budget RTX 3060s to enterprise H100s. Pay with **CLORE**, **BTC**, **USDT**, or **USDC**.

### Why Clore.ai for AI?

* **Affordable** — RTX 4090 from $0.50/day (vs $2–4 on cloud providers)
* **No commitments** — rent by the hour, no contracts
* **Full root access** — Docker containers with GPU passthrough
* **Wide GPU selection** — 3,400+ machines, 12,800+ GPUs online
* **Pay your way** — crypto payments (CLORE, BTC, USDT/USDC)

## 📚 Guide Categories

| Category                                                                                      | Guides | Highlights                                                                                                                                                                              |
| --------------------------------------------------------------------------------------------- | ------ | --------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------- |
| 🤖 [Language Models](https://docs.clore.ai/guides/language-models/language-models)            | **45** | DeepSeek V4, MiMo-V2.5-Pro, Hy3 Preview, Qwen3.6-27B, MiniMax M2.7, Ling-2.6-flash, GLM-5.1, Llama 4, Gemma 3, Qwen3.5-Omni, GLM-5, DeepSeek-R1, Nemotron 3 Super, Ollama, vLLM, SGLang |
| 🤖 [AI Platforms & Agents](https://docs.clore.ai/guides/ai-platforms-and-agents/ai-platforms) | **18** | Dify, CrewAI, AutoGPT, OpenHands, MetaGPT, n8n, LibreChat, Open Interpreter, SuperAGI, SWE-agent                                                                                        |
| 🔊 [Audio & Voice](https://docs.clore.ai/guides/audio-and-voice/audio-voice)                  | **23** | MOSS-TTS, Voxtral TTS, Whisper, Qwen3-TTS, MiniMax Speech 2.6, Dia, ChatTTS, Kokoro, Fish Speech, MeloTTS, StyleTTS2                                                                    |
| 🎬 [Video Generation](https://docs.clore.ai/guides/video-generation/video-generation)         | **14** | Wan2.1, Wan 2.2 VBVR, FramePack, CubeComposer 4K 360°, LTX-2, CogVideoX, SkyReels, HunyuanVideo, Mochi-1, AnimateDiff                                                                   |
| 🎨 [Image Generation](https://docs.clore.ai/guides/image-generation/image-generation)         | **11** | FLUX.2 Klein, HunyuanImage 3.0, SD 3.5, ComfyUI, InvokeAI, SD WebUI Forge                                                                                                               |
| 🧠 [Training](https://docs.clore.ai/guides/training/training)                                 | **12** | Unsloth, Axolotl, LoRA, DreamBooth, DeepSpeed, LLaMA-Factory, TRL, LitGPT, Mergekit                                                                                                     |
| 🏁 [Getting Started](https://docs.clore.ai/guides/getting-started/getting-started)            | **8**  | GPU comparison, pricing, FAQ, troubleshooting                                                                                                                                           |
| 👁️ [Vision Models](https://docs.clore.ai/guides/vision-models/vision-models)                 | **6**  | Qwen2.5-VL, SAM2, LLaVA, Florence-2                                                                                                                                                     |
| 🖼️ [Image Processing](https://docs.clore.ai/guides/image-processing/image-processing)        | **6**  | Real-ESRGAN, ControlNet, Depth Anything, ICLight                                                                                                                                        |
| 🧊 [3D Generation](https://docs.clore.ai/guides/3d-generation/3d-generation)                  | **6**  | Hunyuan World 2.0, TRELLIS, Hunyuan3D 2.1, TripoSR, Gaussian Splatting, Nerfstudio                                                                                                      |
| 🎭 [Talking Heads](https://docs.clore.ai/guides/talking-heads/talking-heads)                  | **3**  | LivePortrait, SadTalker, Wav2Lip                                                                                                                                                        |
| 👤 [Face & Identity](https://docs.clore.ai/guides/face-and-identity/face-identity)            | **3**  | FaceFusion, InstantID, IP-Adapter                                                                                                                                                       |
| ⚙️ [Advanced](https://docs.clore.ai/guides/advanced/advanced)                                 | **5**  | Multi-GPU, API integration, batch processing                                                                                                                                            |
| 🔧 [Other Workloads](https://docs.clore.ai/guides/other-workloads/other-workloads)            | **3**  | Blender, Kandinsky, OpenClaw                                                                                                                                                            |
| 💻 [AI Coding Tools](https://docs.clore.ai/guides/ai-coding-tools/ai-coding)                  | **2**  | Aider, TabbyML (self-hosted Copilot)                                                                                                                                                    |
| 📊 [Comparisons](https://docs.clore.ai/guides/comparisons/comparisons)                        | **7**  | LLM serving, fine-tuning, video gen, TTS, RAG frameworks, vector DBs                                                                                                                    |
| 📹 [Video Processing](https://docs.clore.ai/guides/video-processing/video-processing)         | **2**  | FFmpeg NVENC, RIFE interpolation                                                                                                                                                        |
| 🎵 [Music Generation](https://docs.clore.ai/guides/music-generation/music-generation)         | **1**  | ACE-Step (open-source Suno alternative)                                                                                                                                                 |
| 🔍 [Computer Vision](https://docs.clore.ai/guides/computer-vision/computer-vision)            | **2**  | YOLOv9/v10 detection                                                                                                                                                                    |
| 🗄️ [RAG / Vector DBs](https://docs.clore.ai/guides/rag-and-vector-databases/rag-vectordb)    | **6**  | LlamaIndex, RAGFlow, ChromaDB, Qdrant, Milvus, Weaviate                                                                                                                                 |
| 🔄 [MLOps](https://docs.clore.ai/guides/mlops-and-deployment/mlops)                           | **4**  | MLflow, Triton Inference Server, BentoML, ClearML                                                                                                                                       |
| ⚡ [DevOps GPU](https://docs.clore.ai/guides/gpu-devops/devops-gpu)                            | **2**  | TensorRT-LLM, ONNX Runtime                                                                                                                                                              |
| 🔬 [Science](https://docs.clore.ai/guides/science-and-research/science)                       | **3**  | AlphaFold2, ESMFold, GROMACS molecular dynamics                                                                                                                                         |
| 🎮 [Gaming / Streaming](https://docs.clore.ai/guides/gaming-and-streaming/gaming-streaming)   | **1**  | Sunshine + Moonlight remote gaming                                                                                                                                                      |
| ₿ [Crypto Mining](https://docs.clore.ai/guides/crypto-and-mining/crypto-mining)               | **1**  | XMRig CPU/GPU mining                                                                                                                                                                    |

## 🔥 What's New (April 2026)

### Week of April 27, 2026 — 6 New Guides

* 🤖 **DeepSeek V4** 🆕 — The frontier MoE finally shipped April 22 with full open weights under MIT. Pro: 1.6T total / 49B active, 1M context. Flash: 284B / 13B active, runs quantized on a single 80GB GPU. Day-0 vLLM and SGLang support — [guide](https://docs.clore.ai/guides/language-models/deepseek-v4)
* 🤖 **MiMo-V2.5-Pro** 🆕 — Xiaomi's first open-weight Pro tier (April 27). 1.02T MoE / 42B active, 1M context, MIT, FP8 native. Hybrid attention 6:1 + 3-step MTP — [guide](https://docs.clore.ai/guides/language-models/mimo-v25-pro)
* 🤖 **Hy3 Preview** 🆕 — Tencent Hunyuan 3 (April 23). 295B MoE / 21B active, 256K context, agent + reasoning focus. Single-node 4× A100 80GB serves it FP8 — [guide](https://docs.clore.ai/guides/language-models/hy3-preview)
* 🤖 **Ling-2.6-flash** 🆕 — Ant Group inclusionAI (April 28). 104B MoE / 7.4B active, agent-tuned. Fits on a single RTX 4090 INT4 or single H100 FP8 — [guide](https://docs.clore.ai/guides/language-models/ling-26-flash)
* 🤖 **Qwen3.6-27B** 🆕 — Alibaba's dense 27B (April 21). Apache 2.0, 262K context (extensible to 1M with YaRN). The "one card that just works" coding LLM — [guide](https://docs.clore.ai/guides/language-models/qwen36-27b)
* 🤖 **MiniMax M2.7** 🆕 — 229B MoE coding model (April 9). Custom MiniMax license, FP8 native. Previously listed as proprietary — open weights are now public — [guide](https://docs.clore.ai/guides/language-models/minimax-m27)

#### Industry Notes (April 21–29, 2026)

*Closed-source or API-only this window — self-host alternatives linked.*

* **Claude Opus 4.7** (Anthropic, April 16) — closed weights. → Self-host alternatives: [DeepSeek V4](https://docs.clore.ai/guides/language-models/deepseek-v4), [MiMo-V2.5-Pro](https://docs.clore.ai/guides/language-models/mimo-v25-pro), [GLM-5.1](https://docs.clore.ai/guides/language-models/glm-5-1)
* **GPT-5.5 / GPT-5.5 Pro** (OpenAI, April 23) — closed, API-only. → Self-host alternatives: [DeepSeek V4](https://docs.clore.ai/guides/language-models/deepseek-v4), [MiMo-V2.5-Pro](https://docs.clore.ai/guides/language-models/mimo-v25-pro)
* **Wan 2.7** (Alibaba) — open weights still pending Q2 2026. Wan 2.5 broke tradition and stayed closed; treat 2.7 with skepticism. → Self-host today: [Wan 2.2 VBVR](https://docs.clore.ai/guides/video-generation/wan22-vbvr), [Wan2.1](https://docs.clore.ai/guides/video-generation/wan-video), [LTX-2](https://docs.clore.ai/guides/video-generation/ltx-video-2)
* **Qwen3.6-Plus** (Alibaba) — 1M-context coding agent, API-only on OpenRouter. → Self-host alternative: [Qwen3.6-27B](https://docs.clore.ai/guides/language-models/qwen36-27b), [Qwen3.5-Omni](https://docs.clore.ai/guides/language-models/qwen35-omni)
* **Hugging Face ml-intern** (April 21) — open-source autonomous post-training agent (research → dataset → train). Watching adoption before adding a full guide. → Use today: [Unsloth](https://docs.clore.ai/guides/training/unsloth-finetune), [LLaMA-Factory](https://docs.clore.ai/guides/training/llama-factory), [Axolotl](https://docs.clore.ai/guides/training/axolotl-training)

### Week of April 20, 2026 — 3 New Guides

* 🧊 **Hunyuan World 2.0** 🆕 — Tencent's first open-source 3D world model. Text/image/video → editable mesh + 3D Gaussian Splatting for Unity/Unreal/Blender. \~1.2B params BF16, 12–24 GB VRAM — [guide](https://docs.clore.ai/guides/3d-generation/hunyuan-world-2)
* 🤖 **GLM-5.1** 🆕 — Zhipu AI's 744B MoE / 40B active coding powerhouse. 200K context, MIT license, #1 on SWE-Bench Pro (58.4%) beating Claude Opus 4.6 and GPT-5.4 — [guide](https://docs.clore.ai/guides/language-models/glm-5-1)
* 🔊 **MOSS-TTS** 🆕 — OpenMOSS ultra-lightweight 100M-param TTS. 48kHz stereo, 20 languages, CPU-only inference (no GPU required), zero-shot voice cloning, Apache 2.0 — [guide](https://docs.clore.ai/guides/audio-and-voice/moss-tts)

#### Industry Notes (April 13–20, 2026)

*Closed-source or API-only launches this week — not self-hostable on Clore yet. For each, we've linked the closest open-weight alternative you can run on Clore today.*

* **Alibaba Happy Oyster** — interactive real-time 3D world model (Roaming + Directing modes). Waitlisted / closed beta. → Self-host alternative: [Hunyuan World 2.0](https://docs.clore.ai/guides/3d-generation/hunyuan-world-2)
* **ByteDance Seedance 2.0 API** — fully opened April 14 on Volcano Engine, API-only. → Self-host alternatives: [Wan 2.2 VBVR](https://docs.clore.ai/guides/video-generation/wan22-vbvr), [LTX-2](https://docs.clore.ai/guides/video-generation/ltx-video-2)
* **Qwen 3.6-Plus** — Alibaba's proprietary 1M-context coding agent. API-only. → Self-host alternatives: [Qwen3.5-Omni](https://docs.clore.ai/guides/language-models/qwen35-omni), [GLM-5.1](https://docs.clore.ai/guides/language-models/glm-5-1)
* **Meta Muse Spark** — Meta's first post-Llama proprietary model. → Self-host alternative: [Llama 4](https://docs.clore.ai/guides/language-models/llama4)
* **Wan 2.7** — open weights still pending Q2 2026 (currently API-only via Together AI). → Self-host today: [Wan 2.2 VBVR](https://docs.clore.ai/guides/video-generation/wan22-vbvr)

### Week of April 13, 2026 — 2 New Guides

* 🤖 **Qwen3.5-Omni** 🆕 — Alibaba's unified multimodal model: text, audio, image & video understanding + text & speech generation in one 30B MoE. INT4 fits in 24GB VRAM. Apache 2.0 — [guide](https://docs.clore.ai/guides/language-models/qwen35-omni)
* 🎬 **Wan 2.2 VBVR** 🆕 — Video-Based Video Reference for consistent motion control. Use a reference clip to drive animation in new video generation. FP8 runs on 16–24 GB cards such as the RTX 3090. ComfyUI workflow included — [guide](https://docs.clore.ai/guides/video-generation/wan22-vbvr)

#### Industry Notes (April 6–13, 2026)

*API-only or pending open weights — self-host alternatives linked.*

* **Wan 2.7** (Alibaba) — First+last-frame generation, video-to-video editing, subject referencing. Open weights expected mid-Q2 2026. → Self-host today: [Wan 2.2 VBVR](https://docs.clore.ai/guides/video-generation/wan22-vbvr), [Wan2.1](https://docs.clore.ai/guides/video-generation/wan-video)
* **Qwen3.6-Plus** (Alibaba) — 1M context window, agentic coding. Open-weight release pending. → Self-host alternatives: [Qwen3.5](https://docs.clore.ai/guides/language-models/qwen35), [GLM-5.1](https://docs.clore.ai/guides/language-models/glm-5-1)
* **LTX 2.3** — New LTX video generation model, 22B params, 32GB VRAM baseline. → Use current open release: [LTX-2](https://docs.clore.ai/guides/video-generation/ltx-video-2)

## 🔥 What's New (March 2026)

### Week of March 30, 2026 — 1 New Guide

* 🔊 **Voxtral TTS** 🆕 — Mistral's open-weight 4B TTS model, 9 languages, zero-shot voice cloning from 3s reference, only 3 GB VRAM, Apache 2.0 — [guide](https://docs.clore.ai/guides/audio-and-voice/voxtral-tts)

#### Industry Notes (March 23–30, 2026)

*API-only / shut down — self-host alternatives linked.*

* **Seedance 2.0** (ByteDance) — next-gen video gen launched in CapCut/Dreamina. API-only. → Self-host alternatives: [Wan2.1](https://docs.clore.ai/guides/video-generation/wan-video), [LTX-2](https://docs.clore.ai/guides/video-generation/ltx-video-2)
* **Google Lyria 3 Pro** — music generation via Gemini API/Vertex AI. API-only. → Self-host alternative: [ACE-Step](https://docs.clore.ai/guides/music-generation/ace-step)
* **Gemini 3 Deep Think** — Google's frontier reasoning model, API-only. → Self-host alternatives: [DeepSeek-R1](https://docs.clore.ai/guides/language-models/deepseek-r1), [GLM-5.1](https://docs.clore.ai/guides/language-models/glm-5-1)
* **Sora shutdown** — OpenAI officially shut down the Sora video generation app. → Self-host alternative: [OpenSora](https://docs.clore.ai/guides/video-generation/opensora)
* **DeepSeek V4** ✅ — shipped April 22, 2026 with full open weights (MIT). See [updated guide](https://docs.clore.ai/guides/language-models/deepseek-v4).

### Week of March 16, 2026 — 3 New Guides

* 🤖 **NVIDIA Nemotron 3 Super** 🆕 — 120B MoE / 12B active, 5× throughput, 1M context, Apache 2.0, built for agentic AI — [guide](https://docs.clore.ai/guides/language-models/nvidia-nemotron-3-super)
* 🌐 **Gemini 3.1 Flash Lite** 🆕 — Google's cheapest/fastest model (March 3, 2026), API + open-source alternatives — [guide](https://docs.clore.ai/guides/language-models/gemini-3-1-flash-lite)
* 🎬 **CubeComposer 4K 360° Video** 🆕 — first model to natively generate 4K 360° panoramic video from standard footage (CVPR 2026) — [guide](https://docs.clore.ai/guides/video-generation/cubecomposer-360-video)

### Also This Week (March 9–16, 2026)

* **GPT-5.4** — released March 5, 2026; native computer use (75.0% OSWorld), 1M context. API-only, no local deployment. → Self-host alternatives: [GLM-5.1](https://docs.clore.ai/guides/language-models/glm-5-1), [DeepSeek-R1](https://docs.clore.ai/guides/language-models/deepseek-r1)
* **DeepSeek V4** ✅ — released April 22, 2026, MIT, 1.6T-Pro / 284B-Flash. [Guide](https://docs.clore.ai/guides/language-models/deepseek-v4).
* **Wan 2.2** — new version of Wan video generation foundation model. → Self-host: [Wan 2.2 VBVR](https://docs.clore.ai/guides/video-generation/wan22-vbvr), [Wan2.1](https://docs.clore.ai/guides/video-generation/wan-video)

### New in March 2026 — 6 New Categories, 57 New Guides

* 🗄️ **RAG / Vector DBs** — 6 guides: [LlamaIndex](https://docs.clore.ai/guides/rag-and-vector-databases/llamaindex), [RAGFlow](https://docs.clore.ai/guides/rag-and-vector-databases/ragflow), [ChromaDB](https://docs.clore.ai/guides/rag-and-vector-databases/chromadb), [Qdrant](https://docs.clore.ai/guides/rag-and-vector-databases/qdrant), [Milvus](https://docs.clore.ai/guides/rag-and-vector-databases/milvus), [Weaviate](https://docs.clore.ai/guides/rag-and-vector-databases/weaviate)
* 🔄 **MLOps** — 4 guides: [MLflow](https://docs.clore.ai/guides/mlops-and-deployment/mlflow), [Triton Inference Server](https://docs.clore.ai/guides/mlops-and-deployment/triton-inference-server), [BentoML](https://docs.clore.ai/guides/mlops-and-deployment/bentoml), [ClearML](https://docs.clore.ai/guides/mlops-and-deployment/clearml)
* ⚡ **DevOps GPU** — 2 guides: [TensorRT-LLM](https://docs.clore.ai/guides/gpu-devops/tensorrt-llm), [ONNX Runtime](https://docs.clore.ai/guides/gpu-devops/onnx-runtime)
* 🔬 **Science** — 3 guides: [AlphaFold2](https://docs.clore.ai/guides/science-and-research/alphafold2), [ESMFold](https://docs.clore.ai/guides/science-and-research/esmfold), [GROMACS](https://docs.clore.ai/guides/science-and-research/gromacs) molecular dynamics
* 🎮 **Gaming / Streaming** — [Sunshine + Moonlight](https://docs.clore.ai/guides/gaming-and-streaming/sunshine-moonlight) GPU-accelerated remote gaming
* ₿ **Crypto Mining** — [XMRig](https://docs.clore.ai/guides/crypto-and-mining/xmrig) CPU/GPU mining on Clore.ai

### Latest Models Added (March 4, 2026)

* **DeepSeek V4** ✅ — 1.6T-Pro / 284B-Flash MoE, MIT license, **shipped April 22, 2026** — [guide](https://docs.clore.ai/guides/language-models/deepseek-v4)
* **MiniMax Speech 2.6** 🆕 — ultra-low latency TTS for voice agents, < 300ms TTFB — [guide](https://docs.clore.ai/guides/audio-and-voice/minimax-speech)
* **SGLang** 🆕 — RadixAttention for KV cache sharing, 2–5× throughput vs vLLM on MoE — [guide](https://docs.clore.ai/guides/language-models/sglang)
* **TGI** 🆕 — HuggingFace's production LLM serving with Flash Attention 2 + PagedAttention — [guide](https://docs.clore.ai/guides/language-models/tgi)
* **LLaMA-Factory** 🆕 — fine-tune 100+ LLMs with WebUI, LoRA/QLoRA, RLHF — [guide](https://docs.clore.ai/guides/training/llama-factory)
* **Fish Speech** 🆕 — zero-shot voice cloning in 8+ languages from 10–15s reference audio — [guide](https://docs.clore.ai/guides/audio-and-voice/fish-speech)
* **Mochi-1** 🆕 — 10B parameter video diffusion, 848×480 @ 30fps, 24GB VRAM — [guide](https://docs.clore.ai/guides/video-generation/mochi-1)

### Previously Added (February 2026)

* **Qwen3.5** — Alibaba's 397B MoE, beat Claude 4.5 Opus on math
* **GLM-5** — 744B MoE from Zhipu AI, MIT license, #1 in open-source rankings
* **Ling-2.5-1T** — Ant Group's trillion-parameter model with linear attention
* **Kimi K2.5** — Moonshot AI's 1T MoE, MIT license, visual agentic
* **Mistral Large 3** — 675B MoE, Apache 2.0, frontier coding & reasoning
* **Llama 4 Scout/Maverick** — Meta's MoE revolution, 10M context window
* **Gemma 3** — Google's 27B that beats 405B models
* **FLUX.2 Klein** — sub-second image generation (< 0.5s on RTX 4090)
* **HunyuanImage 3.0** — 80B MoE, largest open-source image model
* **ACE-Step 1.5** — full song generation on < 4GB VRAM
* **FramePack** — AI video with just 6GB VRAM
* **Qwen3-TTS** — voice cloning in 10+ languages from 3 seconds of audio
* **Kani-TTS-2** — ultra-lightweight TTS, only 3GB VRAM
* **DeepSeek-R1** — reasoning model matching OpenAI o1

### Previously Added Categories (February 2026)

* 🤖 **AI Platforms & Agents** — 18 guides: Dify, CrewAI, AutoGPT, OpenHands, MetaGPT, n8n, LibreChat, Open Interpreter, SuperAGI, SWE-agent
* 🎵 **Music Generation** — AI-composed songs with vocals (ACE-Step)
* 💻 **AI Coding Tools** — self-hosted Copilot alternatives (Aider, TabbyML)
* 🔧 **OpenClaw on Clore** — run your AI assistant 24/7 on rented GPUs

## 💰 GPU Pricing (April 2026)

> Snapshot of typical Clore.ai marketplace ranges, late-April 2026. See [clore.ai/marketplace](https://clore.ai/marketplace) for live numbers — spot orders typically run 20–40% cheaper than on-demand.

| GPU         | VRAM  | Typical /hr | Best For                                           | Landing Page                                                                      |
| ----------- | ----- | ----------- | -------------------------------------------------- | --------------------------------------------------------------------------------- |
| RTX 3060    | 12GB  | $0.16–1.00  | TTS, small models, music gen                       | —                                                                                 |
| RTX 3070    | 8GB   | $0.17–3.33  | 7B models, Whisper, batch inference                | [Rent](https://clore.ai/rent-3070.html) / [Host](https://clore.ai/host-3070.html) |
| RTX 3080    | 10GB  | $0.20–3.50  | 7B–14B models, image gen                           | [Rent](https://clore.ai/rent-3080.html) / [Host](https://clore.ai/host-3080.html) |
| RTX 3090    | 24GB  | $0.33–4.00  | SDXL, 32B Q4, Ling-2.6-flash INT4                  | [Rent](https://clore.ai/rent-3090.html) / [Host](https://clore.ai/host-3090.html) |
| RTX 4070 Ti | 12GB  | $0.30–2.00  | SDXL, ComfyUI, 7B models                           | [Rent](https://clore.ai/rent-4070-ti.html)                                        |
| RTX 4080    | 16GB  | $0.45–3.50  | FLUX, 14B models, fine-tuning                      | [Rent](https://clore.ai/rent-4080.html)                                           |
| RTX 4090    | 24GB  | $0.70–4.50  | FLUX, Llama 4, Qwen3.6-27B Q4, Ling-2.6-flash      | [Rent](https://clore.ai/rent-4090.html) / [Host](https://clore.ai/host-4090.html) |
| RTX 5080    | 16GB  | $0.90–9.00  | Fast FLUX, 30B LLMs, Blackwell FP4                 | [Rent](https://clore.ai/rent-5080.html)                                           |
| RTX 5090    | 32GB  | $1.72–10.00 | 70B quantized, Qwen3.6-27B Q8, Nemotron 3 Super    | [Rent](https://clore.ai/rent-5090.html) / [Host](https://clore.ai/host-5090.html) |
| RTX A6000   | 48GB  | $1.50–4.00  | 70B BF16, Qwen3.6-27B BF16                         | [Rent](https://clore.ai/rent-a6000.html)                                          |
| L40S        | 48GB  | $2.00–9.00  | FP8 datacenter inference, single-GPU 27B           | [Rent](https://clore.ai/rent-l40s.html)                                           |
| A100 80GB   | 80GB  | $2.00–6.00  | Hy3 Preview FP8, MiniMax M2.7, training            | [Rent](https://clore.ai/rent-a100-80gb.html)                                      |
| H100 80GB   | 80GB  | $4.00–9.00  | DeepSeek V4 Pro, MiMo-V2.5-Pro, frontier inference | [Rent](https://clore.ai/rent-h100.html)                                           |
| H200 141GB  | 141GB | $6.00–14.00 | DeepSeek V4 single-node, full BF16                 | [Rent](https://clore.ai/rent-h200.html)                                           |
| B200 192GB  | 192GB | $8.00–18.00 | Frontier training, MiMo-V2.5-Pro 8× cluster        | [Rent](https://clore.ai/rent-b200.html)                                           |

*Looking to monetize idle GPUs? See the* [*host pages*](https://clore.ai/llms.txt) *for monthly earnings estimates per card type.*

## 🚀 Quick Start

New to Clore.ai? → [**Quickstart Guide**](https://docs.clore.ai/guides/quickstart)

Already know what you need?

| I want to...                  | Start here                                                                                                                                                                                                               |
| ----------------------------- | ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------ |
| Chat with AI locally          | [Ollama Guide](https://docs.clore.ai/guides/language-models/ollama)                                                                                                                                                      |
| Run a frontier coding LLM     | [DeepSeek V4](https://docs.clore.ai/guides/language-models/deepseek-v4), [GLM-5.1](https://docs.clore.ai/guides/language-models/glm-5-1), or [MiMo-V2.5-Pro](https://docs.clore.ai/guides/language-models/mimo-v25-pro)  |
| Run a single-GPU coding LLM   | [Qwen3.6-27B](https://docs.clore.ai/guides/language-models/qwen36-27b) (RTX 4090 Q4) or [Ling-2.6-flash](https://docs.clore.ai/guides/language-models/ling-26-flash)                                                     |
| Generate images               | [FLUX.2 Klein](https://docs.clore.ai/guides/image-generation/flux2-klein) or [ComfyUI](https://docs.clore.ai/guides/image-generation/comfyui)                                                                            |
| Generate videos               | [FramePack](https://docs.clore.ai/guides/video-generation/framepack) (6GB!) or [Wan2.1](https://docs.clore.ai/guides/video-generation/wan-video)                                                                         |
| Clone a voice                 | [Voxtral TTS](https://docs.clore.ai/guides/audio-and-voice/voxtral-tts), [Qwen3-TTS](https://docs.clore.ai/guides/audio-and-voice/qwen3-tts), or [Zonos](https://docs.clore.ai/guides/audio-and-voice/zonos-tts)         |
| Transcribe audio              | [WhisperX](https://docs.clore.ai/guides/audio-and-voice/whisperx)                                                                                                                                                        |
| Fine-tune a model             | [Unsloth](https://docs.clore.ai/guides/training/unsloth-finetune) (2x faster, 70% less VRAM)                                                                                                                             |
| Generate music                | [ACE-Step](https://docs.clore.ai/guides/music-generation/ace-step) (< 4GB VRAM!)                                                                                                                                         |
| Self-host Copilot             | [TabbyML](https://docs.clore.ai/guides/ai-coding-tools/tabby) ($4.50/month)                                                                                                                                              |
| Run an AI agent platform      | [Dify](https://docs.clore.ai/guides/ai-platforms-and-agents/dify), [CrewAI](https://docs.clore.ai/guides/ai-platforms-and-agents/crewai), or [OpenHands](https://docs.clore.ai/guides/ai-platforms-and-agents/openhands) |
| Self-host ChatGPT alternative | [LibreChat](https://docs.clore.ai/guides/ai-platforms-and-agents/librechat) or [LobeChat](https://docs.clore.ai/guides/ai-platforms-and-agents/lobechat)                                                                 |
| Run AI assistant 24/7         | [OpenClaw on Clore](https://docs.clore.ai/guides/other-workloads/openclaw-on-clore)                                                                                                                                      |
| Pick the right GPU            | [GPU Comparison](https://docs.clore.ai/guides/getting-started/gpu-comparison)                                                                                                                                            |

## 📖 Documentation & Support

* **Main Docs**: [docs.clore.ai](https://docs.clore.ai/)
* **These Guides**: [docs.clore.ai/guides](https://docs.clore.ai/guides/)
* **Marketplace**: [clore.ai/marketplace](https://clore.ai/marketplace)
* **Support**: [clore.ai/support](https://clore.ai/support) / <support@clore.ai>
* **Discord**: [discord.com/invite/clore-ai](https://discord.com/invite/clore-ai)
* **Telegram**: [@clorechat](https://t.me/clorechat)


---

# Agent Instructions: Querying This Documentation

If you need additional information that is not directly available in this page, you can query the documentation dynamically by asking a question.

Perform an HTTP GET request on the current page URL with the `ask` query parameter:

```
GET https://docs.clore.ai/guides/readme.md?ask=<question>
```

The question should be specific, self-contained, and written in natural language.
The response will contain a direct answer to the question and relevant excerpts and sources from the documentation.

Use this mechanism when the answer is not explicitly present on the current page, when you need clarification or additional context, or when you want to retrieve related documentation sections.
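For example, the request above can be issued from Python with only the standard library. This is a minimal sketch: the question string is illustrative, the question must be URL-encoded, and the actual fetch (commented out) requires network access.

```python
from urllib.parse import urlencode

# Base URL of the page being queried (from the docs above).
BASE = "https://docs.clore.ai/guides/readme.md"

def build_ask_url(question: str) -> str:
    """Build the ?ask= query URL, URL-encoding the question."""
    return f"{BASE}?{urlencode({'ask': question})}"

# Illustrative question — any specific, self-contained natural-language query works.
url = build_ask_url("Which GPU is cheapest for running a 27B model at Q4?")
print(url)

# To actually fetch the answer (requires network access):
# import urllib.request
# with urllib.request.urlopen(url) as resp:
#     print(resp.read().decode())
```

`urlencode` handles spaces and punctuation, so the question can be passed as plain text.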
