🌌 Running Jupyter Notebooks for LLM Fine-tuning on Windows 10: A Comprehensive Guide
Jupyter Notebook has become an indispensable tool for data scientists and machine learning practitioners. Its interactive nature, coupled with the ability to integrate code, visualizations, and explanatory text, makes it particularly well-suited for the iterative experimentation inherent in fine-tuning Large Language Models (LLMs) 1. The increasing prevalence of LLMs across various applications has led to a growing demand for accessible methods to fine-tune these models on local hardware, such as Windows 10 machines. While cloud-based platforms like Google Colab offer a convenient environment, running notebooks locally provides several advantages, including the ability to work offline, exercise greater control over the computing environment, and handle sensitive data securely 3. This report aims to provide a comprehensive, step-by-step guide for users looking to leverage their Windows 10 machines to fine-tune LLMs using Jupyter Notebooks originally developed in Google Colab.
🌟 Step-by-Step Guide to Installing Jupyter Notebook on Windows 10
Several methods are available for installing Jupyter Notebook on a Windows 10 operating system. The most suitable approach often depends on the user’s existing Python configuration and specific needs 5. One highly recommended method, particularly for users in the machine learning domain, is the Anaconda distribution 2. Anaconda is a comprehensive Python distribution that comes pre-packaged with numerous libraries essential for data science and machine learning, including Jupyter Notebook 2. To install Anaconda, navigate to the Anaconda website and download the Python 3 version compatible with your Windows system. The installation involves executing the downloaded installer and following the on-screen prompts, with careful consideration given to the option of adding Anaconda to the system’s PATH environment variable. Once installed, Jupyter Notebook can be launched either through the Anaconda Navigator, a graphical interface, or via the Anaconda Prompt, a command-line interface 5. The advantage of Anaconda lies in its streamlined setup, as it bundles many of the fundamental libraries required for LLM fine-tuning, such as NumPy, Pandas, scikit-learn, TensorFlow, and PyTorch 1.

For users who already have a Python installation on their Windows 10 machine, Jupyter Notebook can be installed independently using pip, the standard package installer for Python 5. The process involves opening the Command Prompt, ensuring pip is up to date by running python -m pip install --upgrade pip 5, and then installing Jupyter Notebook with python -m pip install jupyter or pip install notebook 5. After installation, Jupyter Notebook can be launched by typing jupyter notebook in the Command Prompt, which will typically open the application in the default web browser 5.

Miniconda presents another viable option. It is a minimal installer that includes only Python, conda (Anaconda’s package and environment manager), and a small number of essential packages 5. This approach provides the benefits of conda’s environment management capabilities without the larger footprint of the full Anaconda distribution. To install Miniconda, download the installer from the official website and follow the installation instructions. Once installed, the Anaconda Prompt can be used to create a new environment (optional but recommended) with conda create --name <env_name> python=<version>, activate it with conda activate <env_name>, and install Jupyter into it with conda install jupyter. Finally, Python itself can be installed from the Microsoft Store and Jupyter then added via pip; this is convenient for casual users, though Store versions may lag behind official releases.
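For reference, the core commands for the pip and Miniconda routes are shown below; the environment name and Python version are illustrative examples rather than requirements.

```bat
:: pip route (Command Prompt, existing Python installation)
python -m pip install --upgrade pip
python -m pip install jupyter
jupyter notebook

:: Miniconda route (Anaconda Prompt); name and version are examples
conda create --name llm_env python=3.10
conda activate llm_env
conda install jupyter
jupyter notebook
```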
| Method | Pros | Cons | Recommended Use Case |
|---|---|---|---|
| Anaconda | Includes many libraries, easy for beginners | Larger installation size, might include unnecessary packages | Beginners, data science workflows |
| pip | Lightweight, more control over packages | Requires manual installation of dependencies | Users with existing Python installations, specific package needs |
| Miniconda | Lightweight, environment management | Requires manual installation of most libraries | Users wanting environment management without full Anaconda |
| Microsoft Store | Convenient for Windows users | Might have version lags, potentially limited customization | Casual users, quick setup |
🌟 Setting Up a Robust Python Environment for LLM Fine-tuning on Windows 10
Establishing an isolated and well-managed Python environment is paramount for LLM fine-tuning projects on Windows 10 7. Employing a virtual environment is a fundamental best practice that helps prevent conflicts between different project dependencies, ensures the reproducibility of results, and maintains a clean and stable main Python installation 15. Users can create and activate virtual environments using either the built-in venv module in Python or conda, the package and environment manager included with Anaconda and Miniconda 7.

To use venv, open the Command Prompt, navigate to the desired project directory, and execute python -m venv <your_environment_name>, replacing <your_environment_name> with a descriptive name such as llm_env. The environment can then be activated on Windows with <your_environment_name>\Scripts\activate.

Alternatively, users of Anaconda or Miniconda can leverage conda to manage their environments. To create a new conda environment, open the Anaconda Prompt and execute conda create --name <your_environment_name> python=<desired_python_version>, specifying a descriptive name and the desired Python version (e.g., conda create --name llm_env python=3.10).
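In command form, the two routes look roughly as follows (the project path and environment name are placeholders):

```bat
:: venv route (Command Prompt)
cd C:\path\to\your\project
python -m venv llm_env
llm_env\Scripts\activate

:: conda route (Anaconda Prompt)
conda create --name llm_env python=3.10
```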
Activating the conda environment is done using the command conda activate <your_environment_name>. Similar to venv, the command prompt will reflect the active environment. With the virtual environment activated, the next step is to install the essential Python libraries required for LLM fine-tuning using the pip install command 1. Common libraries include tensorflow or torch (the deep learning framework), transformers and datasets (from Hugging Face), accelerate (for distributed training), scikit-learn, pandas, numpy, and visualization libraries like matplotlib and seaborn. For users with compatible NVIDIA GPUs, it is crucial to install the deep learning framework with GPU support to significantly accelerate the computationally intensive fine-tuning process 17. TensorFlow and PyTorch provide specific installation instructions for GPU support that should be followed carefully. It is important to ensure that all these libraries are installed within the activated virtual environment to maintain project isolation. Users who chose Anaconda for Jupyter Notebook installation will find that many of these libraries are either pre-installed or easily installable using the conda install command. However, even in this case, creating and activating a dedicated conda environment for the LLM project is still recommended for optimal dependency management.
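As a concrete sketch, the following installs a typical fine-tuning stack inside the activated environment. The PyTorch index URL depends on the installed CUDA version, so the cu121 URL below is only an example; consult pytorch.org for the command matching your setup.

```bat
:: Core fine-tuning stack (run inside the activated environment)
pip install transformers datasets accelerate scikit-learn pandas numpy matplotlib seaborn

:: PyTorch with CUDA support -- cu121 is an example; verify the right
:: index URL for your CUDA version on pytorch.org
pip install torch --index-url https://download.pytorch.org/whl/cu121
```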
🌟 Seamlessly Managing Dependencies from Google Colab Notebooks
A key aspect of transitioning downloaded Google Colab notebooks to a local Windows 10 environment is managing the Python dependencies 3. Colab notebooks often rely on a specific set of Python packages that might not be installed in the user’s local environment, which can lead to errors when attempting to run the notebook. The most effective method for addressing this is a requirements.txt file 20.

The first step is to generate the requirements.txt file within the Google Colab notebook itself: open the Colab notebook, create a new code cell, and execute !pip freeze > requirements.txt. The ! prefix allows shell commands to be executed within the Colab notebook environment, and pip freeze lists all packages currently installed in the Colab environment, along with their exact versions, saving this information to a file named requirements.txt.

On the local Windows 10 machine, open either the Command Prompt (if using venv) or the Anaconda Prompt (if using conda) and activate the virtual environment created for the LLM fine-tuning project. Navigate to the directory containing the downloaded requirements.txt file using the cd command, then install all listed dependencies with pip install -r requirements.txt 20. Pip will read the file and install the exact versions of the packages that were present in the Colab environment.

Alternative methods exist, such as manually inspecting the Colab notebook for import statements and installing the necessary packages with pip, but this is less reliable and prone to missing dependencies or version mismatches. More advanced dependency management tools like pip-tools 20 offer features for maintaining a requirements.in file with top-level dependencies and compiling it into a requirements.txt file with precise versions. However, for the primary purpose of migrating Colab notebooks, the straightforward use of pip freeze and pip install -r is generally sufficient.
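Put together, the workflow consists of one command in Colab and two on the local machine (the path is a placeholder):

```bat
:: In a Colab code cell (the ! prefix runs a shell command):
::     !pip freeze > requirements.txt
:: Download requirements.txt next to the notebook, then locally:
cd C:\path\to\downloaded\notebook
pip install -r requirements.txt
```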
🌟 Navigating Common Challenges in LLM Fine-tuning within Jupyter on Windows
Running computationally intensive LLM fine-tuning tasks within Jupyter Notebook on a Windows 10 machine can present several challenges due to the resource-intensive nature of these operations 5. Understanding these potential issues and their solutions is crucial for a smooth workflow.

One of the most common challenges is encountering memory issues, often manifested as “Out of Memory” errors 22. LLM models and the fine-tuning process can consume substantial amounts of both RAM and GPU memory (if available). To mitigate this, several strategies can be employed. Reducing the batch size during training is a common approach, as it directly impacts the amount of memory required, although it might increase the overall training time. Gradient accumulation is another technique that allows for simulating larger batch sizes without exceeding memory limits by accumulating gradients over multiple smaller batches. For models that are particularly large, exploring model quantization techniques, such as 8-bit or 4-bit quantization, can significantly reduce the memory footprint 18 (both techniques are sketched in code at the end of this section). Monitoring the system’s memory usage with the Windows Task Manager can help identify potential bottlenecks, and if Jupyter becomes unresponsive due to memory pressure, restarting the kernel can free up memory 24.

Performance issues, particularly slow training times, are another common concern 22. LLM training involves numerous complex computations that can be time-consuming, especially when relying solely on the CPU. If the Windows 10 machine has a compatible NVIDIA GPU, ensuring that the chosen deep learning framework (TensorFlow or PyTorch) is configured to utilize it is paramount, as this can lead to significant speed improvements 17. Optimizing the code by using efficient algorithms and leveraging vectorized operations provided by libraries like NumPy can also help 22. While parallel computing libraries like multiprocessing or Dask can distribute computations across multiple CPU cores, the speedup might be limited for tasks that are primarily GPU-bound 22.

Jupyter kernel crashes or unresponsiveness can occur due to memory overload, long-running processes, or issues with dependencies 22. Restarting the kernel is often the quickest way to recover from a temporary hang. Clearing the output of notebook cells can also help, as excessive output consumes memory and slows down the notebook 22. Breaking the fine-tuning process into smaller, manageable code cells improves stability, and saving the notebook frequently avoids losing progress in case of a crash. Finally, ensuring that all required libraries are installed correctly and that there are no version conflicts is another important step in maintaining a stable environment.
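To make the memory-saving techniques concrete, here is a minimal sketch of gradient accumulation and 4-bit quantization using the Hugging Face transformers API. The model name and hyperparameters are placeholders, not recommendations, and note that bitsandbytes support on native Windows has historically been limited, so this may require a recent release or WSL.

```python
import torch
from transformers import (AutoModelForCausalLM, BitsAndBytesConfig,
                          TrainingArguments)

# 4-bit quantization sharply reduces the VRAM needed to hold the model.
bnb_config = BitsAndBytesConfig(
    load_in_4bit=True,
    bnb_4bit_compute_dtype=torch.float16,
)

model = AutoModelForCausalLM.from_pretrained(
    "gpt2",                          # placeholder; substitute your model
    quantization_config=bnb_config,
    device_map="auto",
)

# A small per-device batch plus gradient accumulation simulates a larger
# effective batch (2 x 8 = 16 here) without the corresponding memory cost.
args = TrainingArguments(
    output_dir="checkpoints",
    per_device_train_batch_size=2,
    gradient_accumulation_steps=8,
)
```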
🌟 Optimizing Hardware for Peak LLM Fine-tuning Performance on Windows 10
The optimal hardware configuration for a Windows 10 machine intended for fine-tuning LLM models depends on the size of the model, the dataset, and the desired training speed 18.

The Graphics Processing Unit (GPU) is arguably the most critical component for LLM tasks due to its parallel processing capabilities 18. NVIDIA GPUs are generally recommended due to their strong CUDA support, which is widely used by popular deep learning frameworks 17. The amount of VRAM (Video RAM) on the GPU is a primary limiting factor, as it dictates the size of the LLM and the batch size that can be used 18. For smaller LLMs (e.g., 7B parameters), consumer-grade GPUs like the NVIDIA GeForce RTX 3090 or RTX 4090 with 24GB of VRAM offer excellent performance 18. Professional options like the NVIDIA A5000 or A6000 with 24GB-48GB of VRAM provide more reliability and better multi-GPU scaling 18. For larger LLMs (e.g., 70B parameters), high-end GPUs such as the NVIDIA A100 or H100 with 40GB or 80GB of VRAM are highly recommended 18, and utilizing multiple GPUs connected via NVLink can further accelerate training for these larger models 18.

While the GPU handles the main computations, a multi-core Central Processing Unit (CPU) with a high clock speed is important for data preprocessing and other background tasks 18. CPUs like AMD Ryzen 7 or 9, or Intel Core i7 or i9, offer a good balance for most scenarios 18. For more demanding workloads or multi-GPU setups, AMD Ryzen Threadripper or Intel Xeon processors, which provide more cores and PCIe lanes, are recommended 18.

Sufficient Random Access Memory (RAM) is crucial for handling large datasets and loading models 18. A minimum of 32GB of DDR4 or DDR5 RAM is suggested for lighter workloads, with 64GB or more being ideal for larger datasets or CPU offloading 18. For very large-scale fine-tuning, 128GB or more might be necessary 19. A general guideline is to have at least double the amount of CPU memory as the total GPU memory in the system 31.

For storage, fast NVMe Solid State Drives (SSDs) with a capacity of 1TB or more are recommended for storing the dataset, model checkpoints, and logs 18. NVMe drives offer significantly faster read/write speeds than traditional Hard Disk Drives (HDDs), which reduces loading times, and for very large datasets 2TB or larger NVMe SSDs are advisable 18.
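Before investing in long fine-tuning runs, it is worth confirming from Python that the GPU is visible and how much VRAM it exposes; a quick check might look like this:

```python
import torch

# VRAM is the primary limiting factor for model and batch size, so
# verify what the framework can actually see before training.
if torch.cuda.is_available():
    props = torch.cuda.get_device_properties(0)
    print(f"GPU: {props.name}, VRAM: {props.total_memory / 1024**3:.1f} GB")
else:
    print("No CUDA GPU detected; training would fall back to the CPU.")
```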
🌟 Boosting Jupyter Notebook Performance for Intensive LLM Workloads
Optimizing the performance of Jupyter Notebook itself is crucial for efficiently handling intensive LLM fine-tuning workloads on Windows 22. This can be achieved through several complementary strategies.

Writing efficient code is paramount. This includes using optimized algorithms and appropriate data structures, such as leveraging NumPy arrays for numerical operations 22. Utilizing vectorized operations in libraries like NumPy and Pandas is generally more efficient than using explicit loops 23. Avoiding unnecessary loops and pre-processing data to reduce the amount of data handled in later stages can also significantly improve performance 23.

Managing memory within Jupyter is also critical. Limiting the output displayed in notebook cells can prevent excessive memory consumption and slowdowns 22. Deleting variables that are no longer needed frees up memory, and if memory usage becomes excessive, restarting the kernel can be beneficial 24. For very large datasets, consider loading and processing data in smaller chunks.

Leveraging external libraries and specialized techniques can further enhance performance. Parallel computing libraries like multiprocessing or Dask can utilize multiple CPU cores for certain tasks 22, and techniques like gradient accumulation can simulate larger batch sizes with limited GPU memory. Specialized libraries like Unsloth 32 are designed specifically to optimize and speed up LLM fine-tuning.

Jupyter Notebook extensions, such as Hinterland for auto code-completion 22, and magic commands, like %%time for profiling cell execution time 36 and %env for managing environment variables 34, can also aid productivity and performance analysis. Using a modern, well-optimized web browser and keeping it updated is also recommended 22. Finally, being mindful of the size and content of the notebook itself, and potentially breaking down large notebooks or converting parts of the code into reusable Python scripts, can prevent performance degradation 33.
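As an illustration of chunked loading and explicit memory release in a notebook cell (the file name, column name, and chunk size are placeholders):

```python
import gc
import pandas as pd

# Process a large CSV in chunks instead of loading it all at once.
counts = None
for chunk in pd.read_csv("data.csv", chunksize=100_000):
    part = chunk["label"].value_counts()  # vectorized, no explicit loop
    counts = part if counts is None else counts.add(part, fill_value=0)

del chunk, part   # drop references that are no longer needed...
gc.collect()      # ...and prompt Python to reclaim the memory
```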
🌟 Efficiently Running Downloaded Colab Notebooks Locally
Running Jupyter Notebooks downloaded from Google Colab efficiently on a local Windows 10 environment requires careful attention to dependency management and potentially utilizing specific features designed for this transition 3. As previously discussed, managing dependencies with a requirements.txt file generated in Colab is a crucial first step.

Beyond dependency management, Google Colab offers a Local Runtime feature that can be particularly useful 37. This feature allows users to connect the Google Colab web interface to a Jupyter server running locally on their Windows 10 machine. This can be advantageous for those who are more familiar with the Colab environment or wish to leverage some of its specific functionalities while utilizing their local hardware resources.

Setting up a local runtime involves several steps (the corresponding commands are shown below). First, ensure Jupyter Notebook is installed on the Windows 10 machine. Next, install the jupyter_http_over_ws extension using pip and enable the server extension. Then, start a local Jupyter server with specific arguments, including allowing origins from https://colab.research.google.com and specifying a port. Once the server is running, it will print a URL with a token that needs to be copied. Finally, in the Google Colab notebook, click the “Connect” button, select “Connect to local runtime,” and paste the copied URL into the dialog box.

While installing the colab package locally using pip install colab is possible 3, it does not fully replicate the entire Colab environment. Users should also be aware that there might be differences in the pre-installed libraries, file system structure, and access to Google Drive between Colab and a local setup.
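The commands below follow Google’s local runtime documentation (work 37 in the list at the end) as of this writing; the port number is an example and can be changed, and older/newer Jupyter versions may use a different extension-enable command.

```bat
:: Install and enable the WebSocket extension
pip install jupyter_http_over_ws
jupyter serverextension enable --py jupyter_http_over_ws

:: Start a local server that Colab is allowed to connect to; copy the
:: token URL that it prints into Colab's "Connect to local runtime" dialog
jupyter notebook ^
  --NotebookApp.allow_origin="https://colab.research.google.com" ^
  --port=8888 ^
  --NotebookApp.port_retries=0
```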
🌟 Exploring Alternatives: Virtual Environments, Cloud Services, and Other IDEs for LLM Fine-tuning on Windows
When considering the best way to run Python code for LLM fine-tuning on Windows 10, several approaches exist, each with its own set of advantages and disadvantages 1. Utilizing a virtual environment on a local machine offers complete control over the development environment, does not require an internet connection after setup, and can be cost-effective for users with sufficient local hardware. However, it is limited by the computational resources available locally and requires manual setup and management. This approach is best suited for users who have adequate local hardware and prefer a fully customizable environment for their LLM fine-tuning tasks.
Cloud services, such as Google Colab, AWS, Azure, and GCP, provide access to powerful and scalable computing resources, including high-end GPUs and TPUs 1. They often offer managed environments with many necessary libraries pre-installed and may include collaboration features. However, these services can incur costs depending on the resources used and require a stable internet connection 18.
Other Integrated Development Environments (IDEs) like PyCharm, VS Code, and Spyder offer more structured and feature-rich development environments compared to Jupyter Notebook 1. These IDEs provide advanced code editing, debugging, and project management tools and can be seamlessly integrated with virtual environments 15. While they still rely on local hardware unless connected to a remote server, they can offer a more organized workflow, especially for larger projects.
| Approach | Pros | Cons | Best Suited For |
|---|---|---|---|
| Local Jupyter with Virtual Env | Full control, no internet needed after setup, cost-effective for available resources, handles sensitive data locally | Limited by local hardware, requires setup and management | Users with adequate local hardware, offline work, sensitive data |
| Cloud Services | Scalable resources (GPUs/TPUs), managed environments, collaboration features | Costly, requires internet access, potential data transfer overhead, potential privacy concerns | Users with large models, need for high computational power, collaborative projects |
| Other IDEs (PyCharm, VS Code, Spyder) | Enhanced features for project management, code navigation, and debugging, integrates with virtual environments | Might have a learning curve, still limited by local hardware unless remote connection is used | Users working on complex projects, preferring a structured IDE environment, needing advanced debugging features |
The optimal approach ultimately depends on the individual user’s hardware capabilities, budget, internet access, familiarity with different tools, and the specific demands of their LLM fine-tuning tasks.
🌟 Conclusion
Running Jupyter Notebooks for LLM fine-tuning on a Windows 10 machine is indeed achievable by adhering to a set of best practices. This report has outlined the necessary steps, from selecting the appropriate installation method for Jupyter Notebook to setting up a robust Python environment using virtual environments. Managing dependencies from Google Colab notebooks effectively through requirements.txt files is crucial for a smooth transition. Addressing common challenges such as memory limitations and performance bottlenecks requires a combination of code optimization, memory management techniques, and potentially leveraging specialized libraries and hardware acceleration. Optimizing the Windows 10 machine’s hardware, particularly the GPU, RAM, and storage, plays a significant role in achieving peak performance. Furthermore, exploring the Google Colab Local Runtime feature can offer a familiar interface for users transitioning from the cloud. Finally, considering alternative approaches like cloud services or other IDEs provides flexibility based on specific project needs and available resources.
🔧 Works cited
1. 7 Best Python IDE for Data Science and Machine Learning - ProjectPro, accessed on March 23, 2025, https://www.projectpro.io/article/best-python-ide-for-data-science-and-machine-learning/812
2. How to Use Jupyter Notebook: A Beginner’s Tutorial - Dataquest, accessed on March 23, 2025, https://www.dataquest.io/blog/jupyter-notebook-tutorial/
3. How to Run Google Colab Locally: A Step-by-Step Guide - LinkSprite, accessed on March 23, 2025, https://www.linksprite.com/how-to-run-google-colab-locally-a-step-by-step-guide/
4. How to Run Google Colab Locally: A Step-by-Step Guide - Saturn Cloud, accessed on March 23, 2025, https://saturncloud.io/blog/how-to-run-google-colab-locally-a-step-by-step-guide/
5. How to Install Jupyter Notebook on Windows - GeeksforGeeks, accessed on March 23, 2025, https://www.geeksforgeeks.org/install-jupyter-notebook-in-windows/
6. How to download and install Jupyter Notebook for Windows 10 / 11 with Python tutorial - YouTube, accessed on March 23, 2025, https://www.youtube.com/watch?v=HLD-Ll_-IT4
7. Setup Python environment for ML - Machine Learning Plus, accessed on March 23, 2025, https://www.machinelearningplus.com/machine-learning/setup-python/
8. 6 Best Python IDEs for Data Science in 2025 - DataCamp, accessed on March 23, 2025, https://www.datacamp.com/tutorial/data-science-python-ide
9. Python IDEs for Windows - GeeksforGeeks, accessed on March 23, 2025, https://www.geeksforgeeks.org/python-ide-for-windows/
10. 20 Most Popular Python IDEs in 2025: Code Like a Pro - Simplilearn.com, accessed on March 23, 2025, https://www.simplilearn.com/tutorials/python-tutorial/python-ide
11. How to Install Jupyter Notebook in Windows 10 - YouTube, accessed on March 23, 2025, https://www.youtube.com/watch?v=0z493LWLWQw
12. Project Jupyter | Installing Jupyter, accessed on March 23, 2025, https://jupyter.org/install
13. How to Download and Install Jupyter Notebook for Windows 10/11 [2024] - YouTube, accessed on March 23, 2025, https://www.youtube.com/watch?v=Xz5XE8zmaJI
14. Module 1: Python environment setup - Colab - Google, accessed on March 23, 2025, https://colab.research.google.com/github/yy/dviz-course/blob/master/docs/m01-intro/lab01.ipynb
15. Virtual environment vs IDE (PyCharm). Which should I do? … - Reddit, accessed on March 23, 2025, https://www.reddit.com/r/Python/comments/3eou7o/virtual_environment_vs_idepycharm_which_should_i/
16. Explain why Python virtual environments are “better”? [closed] - Stack Overflow, accessed on March 23, 2025, https://stackoverflow.com/questions/72835581/explain-why-python-virtual-environments-are-better
17. GPU Execution in a Jupyter Notebook | by Nadira Ahmadi - Medium, accessed on March 23, 2025, https://medium.com/@nahmadics/gpu-execution-in-a-jupyter-notebook-1464d0d73f39
18. Guide to Hardware Requirements for Training and Fine-Tuning Large Language Models - Towards AI, accessed on March 23, 2025, https://towardsai.net/p/artificial-intelligence/guide-to-hardware-requirements-for-training-and-fine-tuning-large-language-models
19. Recommended Hardware for Running LLMs Locally - GeeksforGeeks, accessed on March 23, 2025, https://www.geeksforgeeks.org/recommended-hardware-for-running-llms-locally/
20. Production Jupyter notebooks: A guide to managing dependencies … - ReviewNB, accessed on March 23, 2025, https://blog.reviewnb.com/jupyter-notebook-reproducibility-managing-dependencies-data-secrets/
21. Execute a Jupyter notebook with Airflow | Astronomer Documentation, accessed on March 23, 2025, https://www.astronomer.io/docs/learn/execute-notebooks/
22. How to Optimize Jupyter Notebook Performance? - GeeksforGeeks, accessed on March 23, 2025, https://www.geeksforgeeks.org/how-to-optimize-jupyter-notebook-performance/
23. Jupyter Notebook Increase CPU Usage - MS. Codes, accessed on March 23, 2025, https://ms.codes/blogs/computer-hardware/jupyter-notebook-increase-cpu-usage
24. Jupyter Crashing - MemVerge, accessed on March 23, 2025, https://memverge.com/blog/jupyter-crashing/
25. Fine-Tuning Open-Source LLM using QLoRA with MLflow and PEFT, accessed on March 23, 2025, https://mlflow.org/docs/latest/llms/transformers/tutorials/fine-tuning/transformers-peft.html
26. Easiest Way to Fine-tune LLMs Locally + Code - NO GPU Needed! - YouTube, accessed on March 23, 2025, https://www.youtube.com/watch?v=oG0jsMVTg9w
27. Lord of the Notebooks: Optimizing Jupyter | by Bryan Santos | Medium, accessed on March 23, 2025, https://medium.com/@bryan.santos/lord-of-the-notebooks-optimizing-jupyter-9cc168debcc7
28. 7 Jupyter Notebook Tips and Tricks to Maximize Your Productivity - MakeUseOf, accessed on March 23, 2025, https://www.makeuseof.com/jupyter-notebook-tips-tricks/
29. Quick start for fine-tuning LLMs - Mostly AI, accessed on March 23, 2025, https://mostly.ai/docs/quick-start/fine-tuning-llms
30. Fine-Tuning LLMs using Intel CPUs - Lenovo Press, accessed on March 23, 2025, https://lenovopress.lenovo.com/lp2179-fine-tuning-llms-using-intel-cpus
31. Hardware Recommendations for Machine Learning / AI - Puget Systems, accessed on March 23, 2025, https://www.pugetsystems.com/solutions/ai-and-hpc-workstations/machine-learning-ai/hardware-recommendations/
32. Unsloth Guide: Optimize and Speed Up LLM Fine-Tuning | DataCamp, accessed on March 23, 2025, https://www.datacamp.com/tutorial/unsloth-guide-optimize-and-speed-up-llm-fine-tuning
33. Optimizing Jupyter Notebooks for LLMs - Alex Molas, accessed on March 23, 2025, https://www.alexmolas.com/2025/01/15/ipynb-for-llm.html
34. 28 Jupyter Notebook Tips, Tricks, and Shortcuts - Dataquest, accessed on March 23, 2025, https://www.dataquest.io/blog/jupyter-notebook-tips-tricks-shortcuts/
35. jupyter notebook extremely slow!!! · Issue #6438 - GitHub, accessed on March 23, 2025, https://github.com/jupyter/notebook/issues/6438
36. 10 Jupyter Notebook Tips and Tricks for Data Scientists - KDnuggets, accessed on March 23, 2025, https://www.kdnuggets.com/2023/06/10-jupyter-notebook-tips-tricks-data-scientists.html
37. Local runtimes - Google Colab, accessed on March 23, 2025, https://research.google.com/colaboratory/local-runtimes.html
38. How to connect Google Colab to your local computer with Jupyter Notebook, accessed on March 23, 2025, https://ruslanmv.com/blog/How-to-connect-Google-Colab-to-your-computer
39. Connect Google Colab to a Local Runtime using Jupyter | by Dipan Saha - Medium, accessed on March 23, 2025, https://medium.com/@dipan.saha/connect-google-colab-to-a-local-runtime-using-jupyter-348b7d05e3bb
40. Can you run Google Colab on your local computer? - Stack Overflow, accessed on March 23, 2025, https://stackoverflow.com/questions/63087188/can-you-run-google-colab-on-your-local-computer
41. The best LLM tools for software development - Symflower, accessed on March 23, 2025, https://symflower.com/en/company/blog/2024/ai-tools-software-testing/
42. My experience on starting with fine tuning LLMs with custom data : r/LocalLLaMA - Reddit, accessed on March 23, 2025, https://www.reddit.com/r/LocalLLaMA/comments/14vnfh2/my_experience_on_starting_with_fine_tuning_llms/
43. Private LLMs on Your Local Machine and in the Cloud With LangChain, GPT4All, and Cerebrium | by Sami Maameri | Better Programming - Medium, accessed on March 23, 2025, https://medium.com/better-programming/private-llms-on-local-and-in-the-cloud-with-langchain-gpt4all-and-cerebrium-6dade79f45f6