Midori AI Subsystem Manager
How Docker Works
Docker is a containerization platform that allows you to package and run applications in isolated and portable environments called containers. Containers share the host operating system kernel but have their own dedicated file system, processes, and resources. This isolation allows applications to run independently of the host environment and each other, ensuring consistent and predictable behavior.
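As a quick illustration of that isolation (a sketch, assuming Docker may or may not be installed; the `alpine` image is only an example):

```shell
# Sketch: a container shares the host kernel but has its own filesystem.
# We check for Docker first so this is safe to run anywhere.
if command -v docker >/dev/null 2>&1; then
  echo "host kernel:      $(uname -r)"
  # Same kernel version inside the container, but a different userland.
  echo "container kernel: $(docker run --rm alpine uname -r 2>/dev/null || echo unavailable)"
  msg="checked"
else
  msg="docker not installed"
  echo "$msg"
fi
```

Both `uname -r` calls report the same kernel, while `cat /etc/os-release` inside the container would show Alpine rather than the host distribution.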
Midori AI Subsystem - GitHub Link
The Midori AI Subsystem extends Docker’s capabilities by providing a modular and extensible platform for managing AI workloads. Each AI system is encapsulated within its own dedicated Docker image, which contains the necessary software and dependencies. This approach provides several benefits:
- Simplified Deployment: The Midori AI Subsystem provides a streamlined and efficient way to deploy AI systems using Docker container technology.
- Eliminates Guesswork: Standardized configurations and settings reduce complexities, enabling seamless setup and management of AI programs.
Warnings / Heads up
- This program is in beta! By using it, you take on risk; please see the disclaimer in the footnotes.
- The webserver should be back up; sorry for the outage.
Known Issues
- The AnythingLLM backend seems not to be upserting files; if you are hitting this bug, please report it on GitHub.
- Report Issues -> GitHub Issue
Windows Users
- There seem to be false positives from virus checkers; this file is safe to download. Check here for the code.
- This seems to be a widely known bug with Google Chrome, Edge, and other browsers; here are our virus scans from a few websites. We will try other ways of packaging the files.
Install Midori AI Subsystem Manager
- As we are in beta, we have implemented telemetry to enhance bug discovery and resolution. This data is anonymized and will be configurable once we are out of beta.
Prerequisites
Recommended
Please make an empty folder for the Manager program; do not use your user folder.
Quick install
- Download - https://tea-cup.midori-ai.xyz/download/model_installer_windows.zip
- Unzip it into the LocalAI folder
- Run
subsystem_manager.exe
Quick install with script
Open a Command Prompt or PowerShell terminal and run:
curl -sSL https://raw.githubusercontent.com/lunamidori5/Midori-AI/master/other_files/model_installer/shell_files/model_installer.bat -o subsystem_manager.bat && subsystem_manager.bat
Manual download and installation
Open a Command Prompt or PowerShell terminal and run:
curl -sSL https://tea-cup.midori-ai.xyz/download/model_installer_windows.zip -o subsystem_manager.zip
powershell Expand-Archive subsystem_manager.zip -DestinationPath .
subsystem_manager.exe
Prerequisites
Docker Desktop
or
Docker Engine and Docker Compose
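One way to confirm these prerequisites are present before installing (a sketch; `docker compose` is the Compose v2 plugin syntax):

```shell
# Sketch: check that Docker Engine and the Compose plugin are installed.
if command -v docker >/dev/null 2>&1; then
  docker --version
  docker compose version 2>/dev/null || echo "Docker Compose plugin not found"
  status="docker present"
else
  status="docker missing"
  echo "$status"
fi
```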
Quick install with script
curl -sSL https://raw.githubusercontent.com/lunamidori5/Midori-AI/master/other_files/model_installer/shell_files/model_installer.sh | sh
Manual download and installation
Open a terminal and run:
curl -sSL https://tea-cup.midori-ai.xyz/download/model_installer_linux.tar.gz -o subsystem_manager.tar.gz
tar -xzf subsystem_manager.tar.gz
chmod +x subsystem_manager
./subsystem_manager
Prerequisites
Download and set up Docker Compose Plugin
Manual download and installation
Copy and paste this into the Docker Compose Manager plugin
services:
  midori_ai_unraid:
    image: lunamidori5/deb11_subsystem_manager:master
    ports:
      - 39090:9090
    privileged: true
    restart: always
    tty: true
    volumes:
      - /var/lib/docker/volumes/midoriai_midori-ai-models/_data:/var/lib/docker/volumes/midoriai_midori-ai-models/_data
      - /var/lib/docker/volumes/midoriai_midori-ai-images/_data:/var/lib/docker/volumes/midoriai_midori-ai-images/_data
      - /var/lib/docker/volumes/midoriai_midori-ai-audio/_data:/var/lib/docker/volumes/midoriai_midori-ai-audio/_data
      - /var/run/docker.sock:/var/run/docker.sock

volumes:
  midori-ai:
    external: false
  midori-ai-audio:
    external: false
  midori-ai-images:
    external: false
  midori-ai-models:
    external: false
Start up that container, then click Console and run the following inside it:
curl -sSL https://raw.githubusercontent.com/lunamidori5/Midori-AI/master/other_files/midori_ai_manager/subsystem_python_runner.py > subsystem_python_runner.py && python3 subsystem_python_runner.py
Running the program
Start up that container, then click Console and run the following inside it:
python3 subsystem_python_runner.py
Prerequisites
Recommended
Do not use this method on Windows.
Please make an empty folder for the Manager program; do not use your user folder.
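For example, such a folder could be set up like this (the path `~/midori-ai-manager` is only an example; pick any empty location you like):

```shell
# Sketch: make a dedicated, empty folder for the Manager and work from it.
# The path below is an example, not a required location.
install_dir="$HOME/midori-ai-manager"
mkdir -p "$install_dir"
cd "$install_dir"
echo "working in: $(pwd)"
```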
Quick install with script
Download and run the installer with:
curl -sSL https://raw.githubusercontent.com/lunamidori5/Midori-AI/master/other_files/midori_ai_manager/subsystem_python_runner.py > subsystem_python_runner.py && python3 subsystem_python_runner.py
Running the program
Open a terminal and run:
python3 subsystem_python_runner.py
Reminder: always use your computer's IP address, not localhost, when using the Midori AI Subsystem!
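On Linux, one way to look up that IP address (a sketch; interface setups vary, and Windows users can use `ipconfig` instead):

```shell
# Sketch: print the machine's first LAN IP address (Linux).
ip_addr=$(hostname -I 2>/dev/null | awk '{print $1}')
msg="use this instead of localhost: ${ip_addr:-unknown}"
echo "$msg"
```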
Support and Assistance
If you encounter any issues or require further assistance, please feel free to reach out through the following channels:
- Midori AI GitHub: GitHub Issue
- Midori AI Email: Email Us
- Midori AI Discord: Join our Discord server
—– Install Backends —–
- LocalAI - For LLM inference, embedding, and more
- AnythingLLM - For chatting with your docs using LocalAI or other LLM hosts
- Big-AGI - For chatting with your docs using LocalAI or other LLM hosts
- InvokeAI - For generating images using AI models
- Axolotl - For training your own fine-tuned LLMs
—– Model Info and Links —–
Check out our Model Repository for info about the models used and supported!
—– FAQs about the subsystem —–
What is the purpose of the Midori AI Subsystem?
- The Midori AI Subsystem is a modular and extensible platform for managing AI workloads, providing simplified deployment, standardized configurations, and isolation for AI systems.
How does the Midori AI Subsystem simplify AI deployment?
- The Midori AI Subsystem simplifies AI deployment by providing a streamlined and efficient way to deploy AI systems using Docker container technology, reducing complexities and ensuring consistent and predictable behavior.
What are the benefits of using the Midori AI Subsystem?
- The benefits of using the Midori AI Subsystem include simplified deployment, standardized configurations, isolation for AI systems, and a growing library of supported backends and tools.
What are the limitations of the Midori AI Subsystem?
- The limitations of the Midori AI Subsystem include its current beta status, potential for bugs, and reliance on Docker container technology.
What are the recommended prerequisites for using the Midori AI Subsystem?
- The recommended prerequisites for using the Midori AI Subsystem include Docker Desktop Windows or Docker installed on other operating systems, and a dedicated folder for the Manager program.
How do I install the Midori AI Subsystem Manager?
- You can install the Midori AI Subsystem Manager by downloading the appropriate package for your operating system from the Midori AI Subsystem website and following the installation instructions. Click here to go to the Midori AI Subsystem website
Where can I find more information about the Midori AI Subsystem?
- You can find more information about the Midori AI Subsystem on the Midori AI Subsystem website, which includes documentation, tutorials, and a community Discord.
What is the difference between the Midori AI Subsystem and other AI frameworks?
- The Midori AI Subsystem differs from other AI frameworks by providing a modular and extensible platform specifically designed for managing AI workloads, offering features such as simplified deployment, standardized configurations, and isolation for AI systems.
How does the Midori AI Subsystem handle security?
- The Midori AI Subsystem does not handle security directly, but it relies on the security features provided by the underlying Docker container technology and the specific AI backends and tools being used.
What are the plans for future development of the Midori AI Subsystem?
- The plans for future development of the Midori AI Subsystem include adding new features, improving stability and performance, and expanding the library of supported backends and tools.
Questions from Carly Kay
—– Disclaimer —–
The functionality of this product is subject to a variety of factors that are beyond our control, and we cannot guarantee that it will work flawlessly in all situations. We have taken every possible measure to ensure that the product functions as intended, but there may be instances where it does not perform as expected. Please be aware that we cannot be held responsible for any issues that arise due to the product’s functionality not meeting your expectations. By using this product, you acknowledge and accept the inherent risks associated with its use, and you agree to hold us harmless for any damages or losses that may result from its functionality not being guaranteed.
—– Footnotes —–
*For your safety, we have posted the code of this program on GitHub; please check it out! - GitHub
**If you would like to donate to help us get better servers - Give Support
***If you or someone you know would like a new backend supported by Midori AI Subsystem please reach out to us at [email protected]