If podman or docker is available and we are not in a container
We should re-exec this script inside a llama.cpp container (incomplete
in this commit). The truth is these AI runtime environments are complex
to set up by, say, installing a few packages. Although if one does want
to run outside a container, or say they are running on macOS, we
should support that.

Signed-off-by: Eric Curtin <[email protected]>
ericcurtin committed Jul 27, 2024
1 parent 61cf698 commit 22ab23b
Showing 1 changed file with 20 additions and 0 deletions.
20 changes: 20 additions & 0 deletions ramalama
@@ -160,6 +160,26 @@ def get_ramalama_store():
    return os.path.expanduser("~/.local/share/ramalama")


def in_container():
    if os.path.exists("/run/.containerenv") or os.path.exists("/.dockerenv") or os.getenv("container"):
        return True

    return False


def available(command):
    return shutil.which(command) is not None


def select_container_manager():
    if available("podman"):
        return "podman"
    elif available("docker"):
        return "docker"

    return ""


def main():
    if len(sys.argv) < 2:
        usage()
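The commit message notes that the re-exec into a container is still incomplete here. A minimal sketch of how the new helpers could combine for that purpose follows; the `re_exec_in_container` function, its mount layout, and the image name are assumptions for illustration, not part of this commit:

```python
import os
import shutil
import sys


def in_container():
    # Same detection as the commit: podman and docker drop marker files,
    # and some runtimes export the "container" environment variable.
    return (os.path.exists("/run/.containerenv")
            or os.path.exists("/.dockerenv")
            or bool(os.getenv("container")))


def available(command):
    return shutil.which(command) is not None


def select_container_manager():
    if available("podman"):
        return "podman"
    elif available("docker"):
        return "docker"

    return ""


def re_exec_in_container(image="quay.io/ramalama/ramalama"):
    # Hypothetical helper: image name is a placeholder assumption.
    manager = select_container_manager()
    if in_container() or not manager:
        return  # already containerized, or no runtime found: run natively

    cmd = [manager, "run", "--rm", "-it",
           "-v", f"{os.path.abspath(sys.argv[0])}:/usr/bin/ramalama:ro",
           image, "/usr/bin/ramalama"] + sys.argv[1:]
    os.execvp(manager, cmd)  # replaces the current process on success
```

On macOS or on hosts without podman/docker, `select_container_manager()` returns `""` and the script simply continues natively, which matches the fallback the commit message asks for.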
