Crazy-Deep-Research-GPT: A Fully Local Web-Based AI Research Interface with Ollama

Dec 2, 2025

Introduction

Crazy-Deep-Research-GPT is a hyper-functional, sleek, and fully local web-based AI interface built on Node.js and powered by Ollama. Designed for everything from quick AI responses to deep exploration, it supports multiple intelligent research modes alongside customizable AI behaviors. This open-source project converts fetched web content from HTML to Markdown for readable output and emphasizes user privacy, with zero tracking and no API keys required.

Key Features

  • HTML-to-Markdown Conversion: Uses Turndown together with Cheerio to transform fetched web content into clean Markdown, enabling readable and exportable AI outputs.
  • Ollama Model Integration: Supports local large language models like qwen2.5vl:3b and deepseek-r1:1.5b for powerful AI capabilities without requiring cloud-based APIs.
  • Multiple Research Modes: Includes Normal, Deep Think, Search, Deep Research Lite, Heavy Duty Deep Research, and Auto modes for tailored responses ranging from quick answers to comprehensive research.
  • Customizable AI Settings: Users can tweak system prompts, select primary and auxiliary models, adjust temperature for creativity, and manage WebSocket concurrency limits.
  • Fully Local and Private: All AI processing is local, ensuring no tracking and no need for API keys, preserving user data privacy.
  • Responsive UI: Built with Tailwind CSS, the interface is polished and mobile-friendly, providing a smooth user experience.
  • Location-Based Support: Enables AI queries involving local context such as nearby restaurants or weather updates.
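The project's HTML-to-Markdown feature relies on Turndown and Cheerio, but the core idea can be sketched with a few substitution rules. The following is a toy stand-in for illustration only, not the project's actual pipeline:

```javascript
// Toy HTML-to-Markdown converter: a minimal stand-in for what Turndown does.
// The real project uses Turndown (with Cheerio for parsing); this covers only
// a handful of tags to illustrate the transformation.
function htmlToMarkdown(html) {
  return html
    .replace(/<h1>(.*?)<\/h1>/g, '# $1\n')
    .replace(/<h2>(.*?)<\/h2>/g, '## $1\n')
    .replace(/<strong>(.*?)<\/strong>/g, '**$1**')
    .replace(/<em>(.*?)<\/em>/g, '*$1*')
    .replace(/<a href="(.*?)">(.*?)<\/a>/g, '[$2]($1)')
    .replace(/<p>(.*?)<\/p>/g, '$1\n')
    .trim();
}

console.log(htmlToMarkdown(
  '<h1>Results</h1><p>See <a href="https://ollama.com">Ollama</a> for <strong>local</strong> models.</p>'
));
// # Results
// See [Ollama](https://ollama.com) for **local** models.
```

A real converter must handle nesting, escaping, and malformed HTML, which is exactly why libraries like Turndown exist.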

Installation Guide

To install Crazy-Deep-Research-GPT on your machine, follow these steps:

  • Clone the Repository:
    git clone https://github.com/crazystuffxyz/Crazy-Deep-Research-GPT.git
    cd Crazy-Deep-Research-GPT
  • Install Dependencies:
    npm install
  • Configure Ollama:
    Ensure Ollama is installed and running on your machine, then pull the recommended models:
    ollama pull qwen2.5vl:3b
    ollama pull deepseek-r1:1.5b

Verify these models appear in the advanced settings of the interface.
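Ollama also exposes a local HTTP endpoint, /api/tags, that lists the installed models, so the check can be scripted. Here is a small sketch that validates such a response; the payload below is a hardcoded sample, and in practice you would fetch it from http://localhost:11434/api/tags while Ollama is running:

```javascript
// Checks that the models the interface expects appear in the response from
// Ollama's /api/tags endpoint (which lists locally installed models).
function missingModels(tagsResponse, required) {
  const installed = new Set(tagsResponse.models.map((m) => m.name));
  return required.filter((name) => !installed.has(name));
}

// Sample /api/tags payload for illustration; fetch the real one from
// http://localhost:11434/api/tags on a machine with Ollama running.
const sample = { models: [{ name: 'qwen2.5vl:3b' }] };

console.log(missingModels(sample, ['qwen2.5vl:3b', 'deepseek-r1:1.5b']));
// [ 'deepseek-r1:1.5b' ]
```

An empty array means everything the interface needs is already pulled.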

How to Use

After installation, start the local server and interact with your personal AI:

  • Run the server with npm start.
  • Open your web browser and navigate to http://localhost:3000.
  • Select one of the intelligent AI modes such as Normal, Deep Think, Search, Deep Research Lite, or Heavy Duty Deep Research.
  • Type your query in the input area and send it to receive AI-generated responses in the chat window.

Code Examples

const express = require('express');
const app = express();
app.use(express.json());

// Forwards a prompt to a locally running Ollama instance. This assumes
// Ollama's default HTTP API on port 11434 and a non-streaming response;
// the model name is an example, not a fixed requirement.
async function runOllamaModel(prompt) {
  const res = await fetch('http://localhost:11434/api/generate', {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({ model: 'qwen2.5vl:3b', prompt, stream: false }),
  });
  const data = await res.json();
  return data.response;
}

// Accepts a JSON body of the form { "query": "..." } and returns the answer.
app.post('/api/query', async (req, res) => {
  const userQuery = req.body.query;
  const response = await runOllamaModel(userQuery);
  res.json({ answer: response });
});

app.listen(3000, () => {
  console.log('Server running on http://localhost:3000');
});

This snippet illustrates a basic Node.js Express server setup for handling AI queries using Ollama models locally.
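The advanced settings mention a WebSocket concurrency limit, which keeps a small local machine from being swamped by parallel model calls. A minimal promise-based limiter illustrating that idea might look like this; it is a hypothetical sketch, not the project's actual implementation:

```javascript
// Minimal promise-based concurrency limiter: at most `maxConcurrent` tasks
// run at once; the rest wait in a FIFO queue. Hypothetical sketch of how a
// WebSocket concurrency limit could be enforced.
function createLimiter(maxConcurrent) {
  let active = 0;
  const queue = [];
  const next = () => {
    if (active < maxConcurrent && queue.length > 0) {
      active++;
      queue.shift()(); // release the oldest waiting task
    }
  };
  return async function run(task) {
    await new Promise((resolve) => { queue.push(resolve); next(); });
    try {
      return await task();
    } finally {
      active--;
      next();
    }
  };
}

// Usage: wrap each model call so only two run concurrently.
const limit = createLimiter(2);
// limit(() => runOllamaModel('some prompt'));
```

Each completed task decrements the active count and wakes the next queued one, so bursts of requests drain in order without exceeding the limit.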

Contribution Guide

Contributions to Crazy-Deep-Research-GPT are welcome, especially for improving AI integration, polishing the UI/UX, and extending the feature set. Contributors should follow the MIT License and respect the project's privacy-first architecture. Pull requests should include clear descriptions and tests where applicable.

Community & Support

Community support is primarily managed through the GitHub repository’s issues section. Users and contributors can report bugs, request features, and discuss improvements. The open-source nature encourages collaboration and knowledge sharing.

Conclusion

Crazy-Deep-Research-GPT is a privacy-centric, locally run AI research interface ideal for developers and enthusiasts seeking powerful AI-assisted research without reliance on online APIs. Its flexibility, mode variety, and customization make it a standout open-source tool.

Frequently Asked Questions

What is Crazy-Deep-Research-GPT?

Crazy-Deep-Research-GPT is a fully local, web-based AI research interface powered by Ollama that allows deep and customizable AI research without requiring API keys or tracking.

How do I install the required AI models?

You need to install Ollama on your machine and pull the necessary models such as qwen2.5vl:3b and deepseek-r1:1.5b using Ollama’s CLI commands before using the interface.

Are there any API keys or tracking involved?

No. All AI processing happens locally on your machine, so no API keys are required and no user tracking takes place, giving you full privacy and control.

What research modes are available?

The interface offers various modes including Normal, Deep Think, Search, Deep Research Lite, Heavy Duty Deep Research, and an Auto mode for different research depth and complexity needs.

How can I contribute to the project?

You can contribute by improving features, fixing bugs, or enhancing the user interface. Contributions should follow the MIT license and respect the project’s privacy-first design.