Deep Hat (formerly WhiteRabbitNeo) is a model series that can be used for offensive and defensive cybersecurity.

If you’d rather consume it via Hugging Face’s inference provider service on your ParrotOS box or another Linux distro, the setup looks something like this.

Deep Hat nmap

Hugging Face allots $0.10 of free credit to free-tier accounts to get started, which is enough to test with and get familiar before getting dangerous ;)

Install Silly Tavern. The install instructions work verbatim.

Once it is running, keep the default name of “User” and then click the plug icon to configure the API. This requires creating an access token for your Hugging Face account with only the “Make calls to Inference Providers” permission checked.
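Before wiring the token into Silly Tavern, you can sanity-check it with a couple of lines of Python. This is just a rough sketch, assuming you have exported the token as an HF_TOKEN environment variable; it is not part of the Silly Tavern setup itself.

```python
import os
from huggingface_hub import whoami

# Confirm the token is valid before pasting it into Silly Tavern.
# Assumes the token was exported as HF_TOKEN in your shell.
print(whoami(token=os.environ["HF_TOKEN"])["name"])
```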

Select Custom (OpenAI-compatible) as the Chat Completion Source. For Custom Endpoint (Base URL), enter “https://router.huggingface.co/v1” and paste your token under Custom API key. Enter “DeepHat/DeepHat-V1-7B:featherless-ai” as the Model ID and select None under Available models. Select Semi-strict (alternating roles; no tools). Then click Connect, click the background to close the settings menu, and start chatting away with Deep Hat!
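If you want to double-check the endpoint and model ID outside of Silly Tavern, the same settings can be exercised directly with the openai Python package, since the Hugging Face router speaks the OpenAI-compatible API. A minimal sketch, again assuming the token lives in HF_TOKEN:

```python
import os
from openai import OpenAI

# Mirror the Silly Tavern settings: the HF router as an OpenAI-compatible endpoint.
client = OpenAI(
    base_url="https://router.huggingface.co/v1",
    api_key=os.environ["HF_TOKEN"],
)

resp = client.chat.completions.create(
    model="DeepHat/DeepHat-V1-7B:featherless-ai",
    messages=[{"role": "user", "content": "Give me a quick nmap host-discovery cheat sheet."}],
    max_tokens=512,
)
print(resp.choices[0].message.content)
```

If this prints a sensible reply, the token, endpoint, and model ID are all good, and any remaining issue is in the Silly Tavern settings themselves.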

Note that this uses the featherless.ai inference provider, which runs the model with 8-bit quantization.
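If you’d rather pin the provider explicitly instead of using the :featherless-ai suffix on the model ID, recent versions of huggingface_hub accept a provider argument on InferenceClient. A sketch under those assumptions (HF_TOKEN again holds your token, and your huggingface_hub version lists featherless-ai as a provider):

```python
import os
from huggingface_hub import InferenceClient

# Route the request through featherless.ai explicitly rather than via the
# ":featherless-ai" suffix on the model ID.
client = InferenceClient(provider="featherless-ai", api_key=os.environ["HF_TOKEN"])

out = client.chat_completion(
    model="DeepHat/DeepHat-V1-7B",
    messages=[{"role": "user", "content": "Explain what nmap -sS does."}],
    max_tokens=256,
)
print(out.choices[0].message.content)
```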

Silly Tavern Deep Hat config