A bit of QoL ♥ for the eyes. #424

Open · wants to merge 1 commit into base: main
47 changes: 26 additions & 21 deletions colab/GPU.ipynb
@@ -48,7 +48,8 @@
},
"outputs": [],
"source": [
"#@title <-- Tap this if you play on Mobile { display-mode: \"form\" }\n",
"#@title { display-mode: \"form\" }\n",
"#@markdown <h2><-- Tap <em>this</em> <b>first</b> if you're on a mobile device.</h2>\n",
"%%html\n",
"<b>Press play on the music player to keep the tab alive, then start KoboldAI below (Uses only 13MB of data)</b><br/>\n",
"<audio src=\"https://raw.githubusercontent.com/KoboldAI/KoboldAI-Client/main/colab/silence.m4a\" controls>"
@@ -63,7 +64,8 @@
},
"outputs": [],
"source": [
"#@title <b><-- Select your model below and then click this to start KoboldAI</b>\n",
"# @title <h3><b>Select your model below and then:</b></h3> { display-mode: \"form\" }\n",
"#@markdown <h2><----- Click this button to start KoboldAI.</b></h2><br>\n",
"#@markdown You can find a description of the models below along with instructions on how to start KoboldAI.\n",
"\n",
"Model = \"Nerys V2 6B\" #@param [\"Tiefighter 13B (United)\", \"Echidna 13B (United)\", \"HoloMax 13B (United)\", \"Emerhyst 13B (United)\", \"MythoMax 13B (United)\", \"Huginn 13B (United)\", \"Chronos 13B (United)\", \"Airoboros M2.0 13B (United)\", \"Holodeck 13B (United)\", \"Spring Dragon 13B (United)\", \"Nerys V2 6B\", \"Skein 6B\", \"Janeway 6B\", \"Adventure 6B\", \"Nerys 2.7B\", \"AID 2.7B\", \"Janeway 2.7B\", \"Picard 2.7B\", \"OPT 2.7B\", \"Fairseq Dense 2.7B\", \"Neo 2.7B\"] {allow-input: true}\n",
@@ -72,16 +74,18 @@
"Provider = \"Cloudflare\" #@param [\"Localtunnel\", \"Cloudflare\"]\n",
"use_google_drive = True #@param {type:\"boolean\"}\n",
"\n",
"!!rm -rf /content/sample_data/\n",
"import os\n",
"if not os.path.isfile(\"/opt/bin/nvidia-smi\"):\n",
" from google.colab import runtime\n",
" raise RuntimeError(\"⚠️Colab did not give you a GPU due to usage limits, this can take a few hours before they let you back in. Check out https://lite.koboldai.net for a free alternative (that does not provide an API link but can load KoboldAI saves and chat cards) or subscribe to Colab Pro for immediate access.⚠️\")\n",
" runtime.unassign()\n",
"\n",
"!nvidia-smi\n",
"from google.colab import drive\n",
"if use_google_drive:\n",
" drive.mount('/content/drive/')\n",
" from google.colab import drive\n",
" drive.mount('/content/drive', force_remount=True)\n",
"else:\n",
" import os\n",
" if not os.path.exists(\"/content/drive\"):\n",
" os.mkdir(\"/content/drive\")\n",
" if not os.path.exists(\"/content/drive/MyDrive/\"):\n",
@@ -192,7 +196,20 @@
"else:\n",
" tunnel = \"\"\n",
"\n",
"!wget https://koboldai.org/ckds -O - | bash /dev/stdin -m $Model -g $Version $Revision $tunnel"
"!!wget https://koboldai.org/ckds -O - | bash /dev/stdin -m $Model -g $Version $Revision $tunnel"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {
"id": "5k8fK4F6UiTs"
},
"outputs": [],
"source": [
"# @title <b>Model Cleaner</b> { display-mode: \"form\" }\n",
"#@markdown Out of space? Run this to remove all cached model downloads. (Google Drive models are not affected.)\n",
"!!rm -rf /content/KoboldAI-Client/cache/*\n"
]
},
{
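Several of the new lines also switch from `!` to `!!`. In IPython/Colab, `!cmd` streams the command's output into the cell as it runs, while `!!cmd` is shorthand for `%sx cmd`: it runs the command without live streaming and returns the output as a list of lines, which is presumably why the `rm` and `wget` lines above use it. A small illustration with an arbitrary command (not the notebook's):

```python
# Streams the command's output live into the cell while it runs:
!ls /content

# A single ! can also capture output via assignment (an IPython SList):
files = !ls /content
print(len(files), "entries under /content")

# !! runs the command without live streaming; the result is the list of output
# lines and, as the cell's last expression, is displayed as its Out[] value:
!!ls /content
```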
@@ -240,20 +257,6 @@
"\n",
"Get a error message saying you do not have access to a GPU/TPU instance? Do not continue and try again later, KoboldAI will not run correctly without them."
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {
"cellView": "form",
"id": "5k8fK4F6UiTs"
},
"outputs": [],
"source": [
"#@title <b>Model Cleaner</b>\n",
"#@markdown Out of space? Run this to remove all cached models (Google Drive models are not effected).\n",
"!rm -rf /content/KoboldAI-Client/cache/*\n"
]
}
],
"metadata": {
@@ -262,7 +265,9 @@
"name": "ColabKobold GPU",
"private_outputs": true,
"provenance": [],
"include_colab_link": true
"include_colab_link": true,
"gpuType": "T4",
"cell_execution_strategy": "setup"
},
"kernelspec": {
"display_name": "Python 3",