SelfHostLLM - GPU Memory Calculator for LLM Inference
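
A calculator like this typically combines the model-weight footprint with the KV-cache size. The sketch below is an illustrative rule-of-thumb estimate, not necessarily SelfHostLLM's exact formula: the `overhead` factor, default `kv_bytes`, and the example config are assumptions for demonstration.

```python
# Rule-of-thumb GPU memory estimate for transformer inference.
# NOTE: the 1.2x overhead factor and FP16 KV-cache bytes are assumed
# defaults for illustration, not SelfHostLLM's confirmed values.

def estimate_vram_gb(params_b: float, bytes_per_param: float,
                     num_layers: int, hidden_size: int,
                     context_len: int, batch_size: int = 1,
                     kv_bytes: float = 2.0, overhead: float = 1.2) -> float:
    """Rough VRAM estimate in GB for serving an LLM."""
    # Model weights: parameter count (billions) x bytes per parameter.
    weights = params_b * 1e9 * bytes_per_param
    # KV cache: 2 tensors (K and V) x layers x hidden dim x tokens x batch.
    kv_cache = 2 * num_layers * hidden_size * context_len * batch_size * kv_bytes
    # Apply a fudge factor for activations, CUDA context, fragmentation.
    return (weights + kv_cache) * overhead / 1e9

# Example: a 7B model at FP16 (2 bytes/param), 32 layers,
# hidden size 4096, 4k context, batch size 1.
print(round(estimate_vram_gb(7, 2, 32, 4096, 4096), 1))  # -> 19.4
```

Quantization changes `bytes_per_param` (e.g. roughly 0.5 for 4-bit weights), which is why quantized models fit on much smaller GPUs.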