
Try on kisski #84

Open
wants to merge 10 commits into main
2 changes: 1 addition & 1 deletion .env
@@ -1 +1 @@
# HF_TOKEN=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
KISSKI_API_KEY='b6c89c0b03ba5bd30170933cc6861886'
@haesleinhuepf (Member) commented on Dec 17, 2024:

Hi Yixin,

Never store API keys on GitHub where others can read them.

I recommend revoking both the Hugging Face and the KISSKI key. Otherwise, others can use those platforms on your behalf.

https://docs.github.com/en/rest/authentication/keeping-your-api-credentials-secure?apiVersion=2022-11-28#store-your-authentication-credentials-securely

Best,
Robert
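
For reference, a minimal sketch of the approach the linked GitHub docs describe: keep the key in an untracked `.env` file (listed in `.gitignore`) and read it from the environment at runtime. The use of `python-dotenv` here is an assumption, not part of this PR; any mechanism that keeps the secret out of the repository works.

```python
# Sketch only; assumes python-dotenv is installed and .env is listed in .gitignore.
import os

from dotenv import load_dotenv  # pip install python-dotenv

load_dotenv()  # reads KISSKI_API_KEY=... from a local, untracked .env file

kisski_api_key = os.environ.get("KISSKI_API_KEY")
if not kisski_api_key:
    raise EnvironmentError("Please set KISSKI_API_KEY for KISSKI LLM access.")
```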

Collaborator (Author):

Hi Robert,

Thanks for your advice. I got a message last night, and in my latest version I replaced it with "KISSKI_API_KEY=your_api_key" and uploaded it yesterday, but I don't know why you can still read it.

Best,
Yixin

Member:

Because the git history still contains it. Again, I recommend revoking all keys you have put on the internet.

Collaborator (Author):

Do you know how to delete them? I tried using the "delete file" function, but it does not seem to work.

Collaborator (Author):

I mean how to delete the history.

Member:

You should revoke the key instead of deleting the history.

Member:

E.g. as shown here: https://huggingface.co/docs/hub/en/security-tokens#how-to-manage-user-access-tokens or by sending an email to the KISSKI folks

Collaborator (Author):

Ah, OK, thank you. I will write an email to them to create a new one. Thank you again.

Collaborator (Author):

They have already created a new one and the HF token has already been deleted.

4 changes: 3 additions & 1 deletion docker-compose.yml
@@ -61,7 +61,9 @@ services:
    environment:
      - ELASTICSEARCH_HOST=elasticsearch
      - ELASTICSEARCH_PORT=9200
      - HF_TOKEN=${HF_TOKEN}
      - KISSKI_API_KEY=${KISSKI_API_KEY} # Pass the KISSKI API Key
      - USE_GPU=True # optional; for logging or future local usage
      - MODEL_NAME=meta-llama-3.1-70b-instruct
    depends_on:
      elasticsearch:
        condition: service_healthy
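
For context, a short sketch of how the service code inside the container might pick up the variables set in the `environment:` block above. The variable names come from the compose file; the fallback defaults shown here are illustrative assumptions, not the PR's code.

```python
import os

# Values injected by the docker-compose `environment:` block above.
kisski_api_key = os.environ.get("KISSKI_API_KEY")  # required for KISSKI LLM access
model_name = os.environ.get("MODEL_NAME", "meta-llama-3.1-70b-instruct")
use_gpu = os.environ.get("USE_GPU", "False").lower() == "true"  # optional; for logging only
```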
13 changes: 7 additions & 6 deletions search_engine/chatbot/llm_utilities.py
@@ -25,24 +25,25 @@ def __init__(self, model_name="meta-llama-3.1-70b-instruct", use_gpu=True):
            raise EnvironmentError("Please set KISSKI_API_KEY for KISSKI LLM access.")

        # Point OpenAI client to the KISSKI Chat AI endpoint
        # The KISSKI service is an OpenAI-compatible API with a custom base_url
        openai.api_base = "https://chat-ai.academiccloud.de/v1"

        logger.info(f"KISSKI LLM configured with model '{self.model_name}'. GPU usage = {self.use_gpu}.")
        logger.info(
            f"KISSKI LLM configured with model '{self.model_name}'. GPU usage = {self.use_gpu}."
        )

    def generate_response(self, prompt, max_new_tokens=150, num_return_sequences=1):
        """
        Generate a response from the KISSKI LLM service based on the provided prompt.
        Generate a response from the KISSKI LLM service using the new openai>=1.0.0 Chat interface.
        Args:
            prompt (str): The input prompt for the model.
            max_new_tokens (int): Maximum tokens to generate in the reply (beyond the prompt).
            num_return_sequences (int): Number of response sequences to generate (defaults to 1).
            max_new_tokens (int): Maximum tokens to generate in the reply.
            num_return_sequences (int): How many responses to return.
        Returns:
            str: The generated response text from the LLM.
        """
        try:
            messages = [{"role": "user", "content": prompt}]
            response = openai.ChatCompletion.create(
            response = openai.chat.completions.create(
                model=self.model_name,
                messages=messages,
                max_tokens=max_new_tokens,
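
As a cross-check on the migration above, here is a hedged sketch of the same call using the openai>=1.0.0 client object, where the custom base_url is passed to the client constructor instead of being set on the module. The endpoint URL, model name, and environment variable are taken from the diff; the rest is an illustration, not the PR's code.

```python
import os

from openai import OpenAI  # openai>=1.0.0

# Explicit client configured for the OpenAI-compatible KISSKI endpoint.
client = OpenAI(
    base_url="https://chat-ai.academiccloud.de/v1",
    api_key=os.environ["KISSKI_API_KEY"],
)

response = client.chat.completions.create(
    model="meta-llama-3.1-70b-instruct",
    messages=[{"role": "user", "content": "Hello!"}],
    max_tokens=150,
)
print(response.choices[0].message.content)
```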
5 changes: 1 addition & 4 deletions search_engine/chatbot/requirements_chatbot.txt
@@ -3,7 +3,4 @@ flask
flask-cors
pyyaml
requests
transformers
optimum[openvino]
accelerate
openai
openai==1.57.4