Stephen’s Excellent Adventure into AI 🤖✨
Alright Stephen, get ready to dive into the exciting world of Artificial Intelligence! 🌊 Since I’ll be on vacation for a month, I wanted to put together this guide to help you get started and comfortable with the AI landscape and terminology. I made it super annoying with lots of emojis.
🎯 Goal: Hands-on experience with online AI tools. Internal system training can wait – just get familiar with what’s out there!
🌌 🌐 Chapter 1: Dipping Your Toes in the Online AI Pool
First things first, you need access to the main online AI platforms. I’m arranging with Justin to get you Pro accounts for Gemini and Claude. These run ~$25/month and are essential for full access. Once approved:
- Stephen (that’s you!) will handle getting these accounts set up.
🌟 🚀 Meeting Gemini: Your Online AI Companion
Gemini is a versatile AI, particularly good at debugging. It’s one of the tools I rotate through daily (alongside Claude, ChatGPT, and Co-Pilot) – some are better than others at certain tasks. Get familiar with the standard Gemini app first. Then, Google AI Studio is your next step! This is where a lot of new features show up first, so definitely explore it. It’s also where you’ll find the API section to access API tools, which will be useful later for setting up RooCode.
💡 Tip: Google AI Studio is the go-to place for early-access features and API tools. Don’t skip it!
🌌 🤝 Chapter 2: Befriending Claude, MCPs, and the Context Window
Claude is another excellent AI, and I find it particularly great for writing code. To really unlock its potential, you’ll want to get Claude Desktop set up and integrate it with Model Context Protocol (MCP) servers.
Download Claude Desktop from:
https://support.anthropic.com/en/articles/10065433-installing-claude-for-desktop
It’s available for macOS (version 11 or higher) and Windows (Windows 10 or higher). Currently, there’s no official support for Linux.
⚠️ Note: Make sure to keep the Claude Desktop application updated to get the latest features. You can usually check for updates in the Claude menu.
🌟 🧠 Understanding the Context Window
Here’s a crucial concept when working with chatbots, especially for coding with tools like Claude Desktop: the context window.
The context window is essentially the amount of conversation history (your input and the AI’s responses) that the AI can remember and consider when generating its next response. Think of it as the AI’s short-term memory for that particular chat session.
- Why is this important? Chatbots can only hold so much in their context window, so if a conversation gets too long, the AI starts to “forget” the earlier parts. When you’re using Claude Desktop with MCPs for coding, the context window can feel relatively small. That means you need to be strategic: break tasks into smaller parts and make sure the AI has the information it needs within its current “memory” before it runs out of context.
💡 Tip: Break tasks into smaller parts to prevent Claude from “forgetting” earlier messages.
🌟 ⚙️ Configuring MCP Servers
MCP (Model Context Protocol) is an open standard developed by Anthropic to allow large language models to communicate securely with external tools and data sources. Think of it as a “USB-C port for AI.” Setting up MCP servers for Claude Desktop is like giving it the ability to interact with your computer and external services.
The servers I’d start with:
- mcp-installer
- mcp-server-fetch
- server-filesystem
- server-brave-search
- server-win-cli
⚡ Make sure you’ve installed:
- Node.js
- npm
- npx
- uv package manager
You can sanity-check the Node tools from a terminal with:
node --version
npm --version
💡 Tip: Use Gemini to generate a PowerShell script that checks/installs all these dependencies. Great practice!
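To show the idea (and to give Gemini’s script something to improve on), here’s a minimal sketch that only checks whether each tool is on your PATH. It assumes Windows PowerShell, installs nothing, and the tool list just mirrors the prerequisites above:

```powershell
# Minimal sketch: check that the MCP prerequisites are available on PATH.
# It only reports what's missing; installing/upgrading is left to you (or to Gemini's script).
$tools = @("node", "npm", "npx", "uv")

foreach ($tool in $tools) {
    $cmd = Get-Command $tool -ErrorAction SilentlyContinue
    if ($cmd) {
        # All four of these respond to --version; grab the first line of output.
        $version = (& $tool --version) | Select-Object -First 1
        Write-Host "$tool found ($version)"
    }
    else {
        Write-Host "$tool is MISSING - install it before configuring MCP servers." -ForegroundColor Yellow
    }
}
```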
🌟 📝 Editing the Configuration File
The configuration for MCP servers is done by editing the claude_desktop_config.json file. The PDF guide tells you exactly where to find this file based on your operating system (macOS or Windows).
💡 Tip: You can find and edit this file via Claude Desktop → Settings… → Developer tab.
⚠️ Warning: After editing, fully quit Claude Desktop and reopen it to apply changes.
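For reference, a populated claude_desktop_config.json ends up looking roughly like this. Treat the two entries below as illustrative only: the filesystem path reuses the example folder from later in this guide, and the exact package names and arguments should come from the PDF guide and each server’s README.

```json
{
  "mcpServers": {
    "filesystem": {
      "command": "npx",
      "args": [
        "-y",
        "@modelcontextprotocol/server-filesystem",
        "C:\\Users\\Stephen\\Documents"
      ]
    },
    "fetch": {
      "command": "uvx",
      "args": ["mcp-server-fetch"]
    }
  }
}
```

Each entry under mcpServers is just a name you choose plus the command Claude Desktop should run to start that server (which is why Node.js/npx and uv need to be installed first).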
🌟 🔨 Verifying & Using MCPs
Once Claude Desktop restarts with the updated configuration, you should see a hammer icon in the bottom right of the chat input box. Clicking this icon will show you the available tools provided by your configured MCP servers. You’ll see tools for filesystem operations (like read_file, write_file, search_files) and terminal commands (execute_command).
Try commands like:
Read the file C:\Users\Stephen\Documents\my_notes.txt
Execute the command ipconfig in the terminal
💡 Tip: If the hammer icon doesn’t show up, an MCP server isn’t configured correctly or isn’t running. Check your JSON syntax in the config file, verify your Node.js/npm installation, or check the Claude Desktop log files for errors (located in ~/Library/Logs/Claude/ on macOS and %APPDATA%\Claude\logs\ on Windows). You can also paste the log into a chat and ask Claude about it; sometimes that’s helpful.
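On Windows, a quick way to peek at the newest log is something like the snippet below (log file names vary between versions, so it just grabs whichever one was written to most recently):

```powershell
# Show the last 50 lines of the most recently updated Claude Desktop log file (Windows).
$logDir = "$env:APPDATA\Claude\logs"
Get-ChildItem $logDir -Filter *.log |
    Sort-Object LastWriteTime -Descending |
    Select-Object -First 1 |
    Get-Content -Tail 50
```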
⚠️ Security Considerations: Granting AI access to your file system and command line comes with risks. Only allow access to necessary directories and carefully review commands before authorizing them. Keep your software updated and be aware of broader AI security risks like prompt injection.
Getting Claude Desktop fully configured with these MCPs will give you a fantastic little coding assistant and teach you a lot about how these models can interact with your local environment. It might be a bit frustrating with the context window at times, but experimenting with it is the best way to learn its quirks and how to work effectively within its limits.
🌌 🧩 Bonus: RooCode Plugin for VSCode
If you like using Claude Desktop for coding assistance and you use VSCode (a popular code editor), you might also want to check out the RooCode plugin. You can add MCP capabilities to RooCode as well, though it already comes with command-line and file system access built in, and you can actually see what it’s doing on the command line. It supports various AI providers, including Google Gemini.
- Getting Started with RooCode: You can use an API key from the API section in Google AI Studio to connect RooCode to models.
⚠️ Warning: RooCode can be a bit “hungry” and burn through API credits quickly! Don’t auto-run unless you’re using a free-tier key. I once got lazy and put my enterprise API key in auto-mode and racked up a $50 bill in about 45 minutes updating a client’s website! If you’re using a pro account API key, it should just stop working when you hit the limit, which is much safer!
🌌 🌍 Chapter 3: Exploring the Broader AI Ecosystem
Beyond Gemini and Claude, there are other important resources to be aware of. Visit the Ollama website and browse their model list. Ollama is the engine we use for our local AI, Mount-E Chat, allowing us to run large language models locally. The models listed on their site are generally vetted to some degree.
Hugging Face is another massive and essential platform. Think of it as the GitHub of AI. It’s a huge community hub where people share AI models, datasets, and tools. They host over a million models, datasets, and apps.
💡 Tip: Don’t get overwhelmed. Just know it’s the main spot for model sharing and discovery.
🌌 🏔️ Chapter 4: Getting Familiar with Mount-E Chat (Our Internal AI)
Finally, you need to get comfortable with our own internal AI system, Mount-E Chat, which uses OpenWebUI as its frontend interface. It’s built on a secure, on-premises deployment model. Get with Larry to make sure you have access to the staging environment (I think you already do!).
Play around in there and understand the interface and how the administration works – it’s pretty straightforward. It runs on Ollama using Docker and Docker Compose for management. Our current setup uses a Debian workstation powered by an NVIDIA RTX 4090 GPU, but we’re planning a significant upgrade to a more powerful server system within the next year.
- Tools and Filters: One of the key areas I’m working on in OpenWebUI is the filter tools. These are used to create JSON-outputted templates, which is how I build things like briefing notes and other creator models (there’s a made-up example of that kind of template below). I’ve put a Template Filter file in there that walks through how the filter works, with lots of explanations. Again, I get AI to do most of the heavy lifting on this – I’m totally lazy when it comes to repetitive tasks.
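To make “JSON-outputted template” a bit more concrete, here’s an invented example of the kind of structure a briefing-note template might ask the model to fill in. The field names are made up for illustration; the real structure lives in the Template Filter file in staging.

```json
{
  "template": "briefing_note",
  "title": "Example: quarterly network upgrade",
  "summary": "Two or three sentences summarising the topic for a non-technical reader.",
  "key_points": [
    "First key point",
    "Second key point"
  ],
  "action_items": [
    { "owner": "Stephen", "task": "Review the draft", "due": "TBD" }
  ]
}
```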
🌌 🏁 Wrapping Up: Your AI Journey Begins!
You’ve got the tools, now go play! 🕹️
✅ Get hands-on with Gemini and Claude.
✅ Learn the strengths and quirks of each.
✅ Learn about MCP with Claude Desktop.
✅ Learn the Mount-E Chat interface and filters.
✅ Switch, explore, break things (safely!) and ask the team when you’re stuck.
AI is growing fast – the best way to learn is to dive in and experiment.
See you soon, Stephen! Have fun while I’m gone! 🚀😄