Midori AI Subsystem
Subsystem and Manager are still in beta!
For issues, please open a PR on this GitHub - https://github.com/lunamidori5/Midori-AI
How Docker Works
Docker is a containerization platform that allows you to package and run applications in isolated and portable environments called containers. Containers share the host operating system kernel but have their own dedicated file system, processes, and resources. This isolation allows applications to run independently of the host environment and each other, ensuring consistent and predictable behavior.
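As a small illustration of that isolation (this example is not part of the Midori AI setup), the throwaway container below sees its own filesystem and process list while sharing the host's kernel:
# Start a disposable Alpine Linux container and look at its isolated view of the system
docker run --rm alpine sh -c "cat /etc/os-release && ps"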
Midori AI Subsystem - Github Link
The Midori AI Subsystem extends Docker's capabilities by providing a modular and extensible platform for managing AI workloads. Each AI system is encapsulated within its own dedicated Docker image, which contains the necessary software and dependencies. This approach keeps each backend isolated, reproducible, and easy to add or remove independently.
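For example, once a backend is installed through the subsystem, it runs as its own container created from its own image. A quick way to see that separation with the Docker CLI (the names shown will depend on what you have installed):
# List running containers together with the image each one was created from
docker ps --format "table {{.Names}}\t{{.Image}}\t{{.Status}}"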
Warnings / Heads up
Known Issues
Windows Users
Should you be missing this prerequisite, the manager can install it on your behalf: Docker Desktop Windows
Please make an empty folder for the Manager program; do not use your user folder.
Open a Command Prompt or PowerShell terminal and run:
curl -sSL https://raw.githubusercontent.com/lunamidori5/Midori-AI-Subsystem-Manager/master/model_installer/shell_files/model_installer.bat -o subsystem_manager.bat && subsystem_manager.bat
Alternatively, open a Command Prompt or PowerShell terminal and run:
curl -sSL https://tea-cup.midori-ai.xyz/download/model_installer_windows.zip -o subsystem_manager.zip
powershell Expand-Archive subsystem_manager.zip -DestinationPath .
subsystem_manager.exe
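Before launching the manager, you can optionally confirm that Docker Desktop is running:
# Both the client and server versions should print if Docker Desktop is up
docker version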
If these prerequisites are missing, the manager can install them for you on Debian or Arch-based distros: Docker Engine and Docker Compose
Open a terminal and run:
curl -sSL https://raw.githubusercontent.com/lunamidori5/Midori-AI-Subsystem-Manager/master/model_installer/shell_files/model_installer.sh > model_installer.sh && bash ./model_installer.sh
or
Open a terminal and run:
curl -sSL https://tea-cup.midori-ai.xyz/download/model_installer_linux.tar.gz -o subsystem_manager.tar.gz
tar -xzf subsystem_manager.tar.gz
chmod +x subsystem_manager
sudo ./subsystem_manager
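Optionally, you can confirm Docker is working before launching the manager (assuming Docker Engine and the Compose plugin are already installed; otherwise the manager can install them for you):
# Quick sanity check of the Docker Engine and Compose plugin
docker --version
docker compose version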
Unraid is not fully supported by the Subsystem Manager. We are working hard to fix this; if you have issues, please let us know on the GitHub.
Download and set up the Docker Compose Plugin.
Click on the settings gear icon, then click the compose file menu item.
After that, copy and paste one of the following into the Docker Compose Manager plugin.
You may need to edit the mount paths to the left of the :
CPU Only:
services:
  midori_ai_unraid:
    image: lunamidori5/subsystem_manager:master
    ports:
      - 39090:9090
    privileged: true
    restart: always
    tty: true
    volumes:
      - /mnt/user/appdata/MidoriAI/system:/var/lib/docker/volumes/midoriai_midori-ai/_data
      - /mnt/user/appdata/MidoriAI/models:/var/lib/docker/volumes/midoriai_midori-ai-models/_data
      - /mnt/user/appdata/MidoriAI/images:/var/lib/docker/volumes/midoriai_midori-ai-images/_data
      - /mnt/user/appdata/MidoriAI/audio:/var/lib/docker/volumes/midoriai_midori-ai-audio/_data
      - /var/run/docker.sock:/var/run/docker.sock
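Before starting the stack, you may want to create the host folders referenced in the volumes section above; a minimal sketch using the paths from the compose file (adjust if your appdata share differs):
# Create the appdata folders that the compose file mounts (run from the Unraid terminal)
mkdir -p /mnt/user/appdata/MidoriAI/{system,models,images,audio}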
CPU and Nvidia GPU:
services:
  midori_ai_unraid:
    deploy:
      resources:
        reservations:
          devices:
            - driver: nvidia
              count: 1
              capabilities: [gpu]
    image: lunamidori5/subsystem_manager:master
    ports:
      - 39090:9090
    privileged: true
    restart: always
    tty: true
    volumes:
      - /mnt/user/appdata/MidoriAI/system:/var/lib/docker/volumes/midoriai_midori-ai/_data
      - /mnt/user/appdata/MidoriAI/models:/var/lib/docker/volumes/midoriai_midori-ai-models/_data
      - /mnt/user/appdata/MidoriAI/images:/var/lib/docker/volumes/midoriai_midori-ai-images/_data
      - /mnt/user/appdata/MidoriAI/audio:/var/lib/docker/volumes/midoriai_midori-ai-audio/_data
      - /var/run/docker.sock:/var/run/docker.sock
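Before using the GPU variant, you may want to confirm the NVIDIA container runtime is working; a rough check, assuming the Nvidia driver plugin is installed (adjust the CUDA image tag to one available for your driver):
# Run nvidia-smi inside a throwaway CUDA container; your GPU details should print
docker run --rm --gpus all nvidia/cuda:12.3.2-base-ubuntu22.04 nvidia-smi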
Start up that Docker container, then click Console and run the following inside it:
python3 subsystem_python_runner.py
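If you prefer the command line over the Unraid web console, the same command can be run with docker exec; a sketch, assuming the container is named after the midori_ai_unraid service (check docker ps for the real name):
# Run the subsystem runner inside the manager container
docker exec -it midori_ai_unraid python3 subsystem_python_runner.py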
Do not use this method on Windows.
Please make an empty folder for the Manager program; do not use your user folder.
Download this file:
curl -sSL https://raw.githubusercontent.com/lunamidori5/Midori-AI-Subsystem-Manager/master/midori_ai_manager/subsystem_python_runner.py > subsystem_python_runner.py
Open a terminal and run:
python3 subsystem_python_runner.py
Or, if you need elevated permissions, open a terminal and run:
sudo python3 subsystem_python_runner.py
Reminder: always use your computer's IP address, not localhost, when using the Midori AI Subsystem!
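If you are unsure what your computer's IP address is, these standard commands will show it:
# Linux: list interface addresses
ip addr show
# Windows (Command Prompt or PowerShell): show network configuration
ipconfig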
If you encounter any issues or require further assistance, please feel free to reach out through the following channels:
The functionality of this product is subject to a variety of factors that are beyond our control, and we cannot guarantee that it will work flawlessly in all situations. We have taken every possible measure to ensure that the product functions as intended, but there may be instances where it does not perform as expected. Please be aware that we cannot be held responsible for any issues that arise due to the product’s functionality not meeting your expectations. By using this product, you acknowledge and accept the inherent risks associated with its use, and you agree to hold us harmless for any damages or losses that may result from its functionality not being guaranteed.
*For your safety, we have posted the code of this program on GitHub; please check it out! - Github
**If you would like to donate to help us get better servers - Give Support
***If you or someone you know would like a new backend supported by the Midori AI Subsystem, please reach out to us at [email protected]
Note: the subsystem remaps each program's default port by prefixing a 3 (for example, port 3000 becomes 33000).
Here is a link to AnythingLLM Github
Type yes or no into the menu.
Type in anythingllm into the menu, then hit enter.
Enjoy your new copy of AnythingLLM; it is running on port 33001.
Remember to use your computer's IP address, not localhost; for example, 192.168.10.10:33001 or 192.168.1.3:33001.
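A quick way to confirm AnythingLLM is reachable from another machine (replace the example IP with your computer's own):
# Any HTTP response to this HEAD request means the web UI is up
curl -I http://192.168.10.10:33001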
If you need help, please reach out on our Discord / Email; or reach out on their Discord.
Here is a link to Big-AGI Github
Type 2 into the main menu.
Type yes or no into the menu.
Type in bigagi into the menu, then hit enter.
Enjoy your new copy of Big-AGI; it is running on port 33000.
Remember to use your computer's IP address, not localhost; for example, 192.168.10.10:33000 or 192.168.1.3:33000.
If you need help, please reach out on our Discord / Email; or reach out on their Discord.
Here is a link to LocalAI Github
This guide will walk you through the process of installing LocalAI on your system. Please follow the steps carefully for a successful installation.
Type 2 into the main menu to begin the installation process.
Type yes or no to proceed with GPU support or CPU-only support, respectively.
Type localai into the menu and press Enter to start the LocalAI installation.
Enjoy your new copy of LocalAI; it is running on port 38080.
Remember to use your computer's IP address, not localhost, when accessing LocalAI. For example, you would use 192.168.10.10:38080/v1 or 192.168.1.3:38080/v1, depending on your network configuration.
If you encounter any issues or require further assistance, please feel free to reach out through our Discord / Email.
To install a model through the LocalAI Model Installer:
Type 5 to enter the Backend Program Menu.
Type 10 to enter the LocalAI Model Installer.
Enter the name of your LocalAI container; it is normally localai-api-1, but not always. If you need help, reach out on the Midori AI Discord / Email.
Answer the remaining yes or no prompts as they appear.
Once the model is installed, you can start sending requests to it. Need help on how to do that? Stop by - How to send OpenAI request to LocalAI
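As a rough illustration of such a request (a sketch: the IP address and model name below are placeholders, so substitute the values from your own install):
# Minimal OpenAI-style chat request against LocalAI's /v1 endpoint
curl http://192.168.10.10:38080/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{"model": "gpt-4", "messages": [{"role": "user", "content": "Hello from the Midori AI Subsystem"}]}'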
To install a model from Hugging Face instead:
Type 5 to enter the Backend Program Menu.
Type 10 to enter the LocalAI Model Installer.
Enter the name of your LocalAI container; it is normally localai-api-1, but not always. If you need help, reach out on the Midori AI Discord / Email.
Answer the yes or no prompts as they appear, then type huggingface when asked what size of model you would like.
When asked for the model, you can paste the full download link, for example: https://huggingface.co/mlabonne/gemma-7b-it-GGUF/resolve/main/gemma-7b-it.Q2_K.gguf?download=true
Or use the shorter repo/file form: mlabonne/gemma-7b-it-GGUF/gemma-7b-it.Q2_K.gguf
Once the model is installed, you can start sending requests to it. Need help on how to do that? Stop by - How to send OpenAI request to LocalAI
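To confirm the model was registered, you can list the models LocalAI currently serves (replace the example IP with your own):
# OpenAI-compatible model listing endpoint exposed by LocalAI
curl http://192.168.10.10:38080/v1/models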
Here is a link to InvokeAI Github
This guide provides a comprehensive walkthrough for installing InvokeAI on your system. Please follow the instructions meticulously to ensure a successful installation.
Type 2 into the main menu to access the "Installer/Upgrade Menu".
When prompted, type yes.
Install InvokeAI by typing invokeai and pressing Enter.
Type 5 to access the "Backend Programs Menu", then press enter.
Note: The installation process may appear inactive at times; however, rest assured that progress is being made. Please refrain from interrupting the process to ensure its successful completion.
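If you are unsure which host port InvokeAI was mapped to, one way to check is to list the running containers and their published ports (container names vary; look for the one mentioning invokeai):
# Show each container's name alongside its published ports
docker ps --format "table {{.Names}}\t{{.Ports}}"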
Enjoy using InvokeAI! For additional help or information, please reach out on our Discord / Email or check the InvokeAI Github.