Berkay Buğra Gök
About Me
Hi, I am Berkay, a junior computer engineering student from Boğaziçi University.
I like building end-to-end projects that blend different fields of computer science. My current interests lie in Deep Learning, Data Science, LLMs and software engineering.
Technical
I have experience with:
Python
Java
C/C++
Assembly flavors
Also, I like working with high-level frameworks to come up with creative solutions.
Consider checking out my portfolio website and the dashboard I created for earthquakes near Turkey.
Interests
I love reading or watching science fiction. The GIF above is from Blade Runner (1982), which is one of my favorites.
I also enjoy wristwatches and exploring their history.
AI Fitness Tutor Integration
In our fitness app, I integrated the Groq API through the official groq Python package to implement an intelligent fitness tutor. The tutor uses language models hosted by Groq to give users real-time, personalized coaching and feedback. User prompts, such as fitness goals, progress updates, or questions about training and nutrition, are sent to the Groq endpoint, and the tutor generates dynamic, context-aware responses. The integration keeps the round trip to the API fast, so the exchange feels seamless inside the app's chat interface and delivers AI-driven fitness guidance directly in the app environment. The core of the integration is the helper function below:
import os

from groq import Groq

# Import path assumed; these Django models store the AI chat history
from .models import UserAiMessage, AiTutorResponse


def get_response_from_groq(message, chat):
    # Get the last 3 user messages and AI responses for this chat
    user_messages = UserAiMessage.objects.filter(chat=chat).order_by('-created_at')[:3]
    ai_responses = AiTutorResponse.objects.filter(chat=chat).order_by('-created_at')[:3]

    # Combine both sides of the conversation, keeping timestamps for ordering
    history = []
    for msg in user_messages:
        history.append((msg.created_at, {"role": "user", "content": msg.message}))
    for resp in ai_responses:
        history.append((resp.created_at, {"role": "assistant", "content": resp.response}))

    # Sort by created_at in ascending order (oldest first), then drop the timestamps
    history.sort(key=lambda item: item[0])
    history = [entry for _, entry in history]

    # Create the messages array with the system message first
    messages = [
        {
            "role": "system",
            "content": "You are a helpful tutor assistant that excels at fitness. Give precise and concise answers to the user. "
                       "Try to be helpful and supportive. If the question is not related to fitness, say you are not able to help with that. "
                       "Finally, give your responses in plain text and don't use markdown formatting."
        }
    ]

    # Add the historical context, then the current message
    messages.extend(history)
    messages.append({"role": "user", "content": message})

    client = Groq(
        api_key=os.environ.get("GROQ_API_KEY"),
    )

    chat_completion = client.chat.completions.create(
        messages=messages,
        model="llama-3.3-70b-versatile",
    )

    return chat_completion.choices[0].message.content
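The helper above only builds the prompt and calls the model; saving the exchange back to the database happens separately. As a minimal sketch of the calling side (the ask_ai_tutor name is my own, and I am assuming created_at is auto-populated on both models), it could look like this:

# Hypothetical calling code, not the project's exact implementation
def ask_ai_tutor(chat, user_text):
    # Generate the reply first so the current message is not duplicated
    # in the history that get_response_from_groq reads from the database
    reply = get_response_from_groq(user_text, chat)

    # Persist both sides of the exchange (created_at is assumed to be
    # auto_now_add, so future prompts can order the history by timestamp)
    UserAiMessage.objects.create(chat=chat, message=user_text)
    AiTutorResponse.objects.create(chat=chat, response=reply)
    return reply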
I had already worked on the messaging feature, so besides real-time chatting with real people, I added an AI tutor to message with. The Groq API is free for many models up to certain rate limits.
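For completeness, here is a rough sketch of how this could be wired into the chat interface as an HTTP endpoint. It assumes Django REST Framework and a hypothetical AiChat model with a user field; the actual view and model names in our project may differ.

# Rough sketch only: the view name, URL parameter, and AiChat model are assumptions
from rest_framework.decorators import api_view, permission_classes
from rest_framework.permissions import IsAuthenticated
from rest_framework.response import Response


@api_view(["POST"])
@permission_classes([IsAuthenticated])
def ai_tutor_message(request, chat_id):
    # Look up the caller's AI chat (model and field names assumed)
    chat = AiChat.objects.get(id=chat_id, user=request.user)

    # Delegate to the hypothetical helper sketched above
    reply = ask_ai_tutor(chat, request.data["message"])
    return Response({"response": reply})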
Weekly Effort
We have decided to hold our weekly meetings on Saturdays from now on, so the weeks in my weekly effort table will start and end on Saturdays as well.
I know it looks a bit scattered, but I was all over the place this past week. I mostly worked on frontend/backend integration issues and kept the project in a state that was ready to deploy.