Expanding My GPT App: Introducing ChatGPT Deutsch – A German-Language Frontend for Structured AI Conversations

First of all, I apologize to the admins and everyone else, because this post is a bit of an advertisement.
Following up on the feedback assistant I posted earlier, I’ve been working on a related project that takes the core idea further. It’s called ChatGPT Deutsch, and it’s aimed at enabling structured, native-language interaction with GPT for German-speaking users.

While the initial app focused on summarizing survey feedback, this one is more of a general-purpose assistant — but designed specifically for German input/output and structured tasks like drafting emails, clarifying documents, or even generating responses in certain styles (e.g., formal/informal, technical/neutral, etc.).

Why I built it

A few non-English-speaking users (friends and internal testers) were hesitant to interact with “just a chatbot”, especially when they had to think in English. So I thought: what if I wrap GPT in a UI that not only speaks German fluently but also guides the conversation around common tasks? That led to a much more usable and predictable experience.

What it does

  • Provides several “modes” (e.g. summarize, explain, translate, rewrite)
  • Lets users choose tone/formality (Du/Sie, locker/formell, etc.)
  • Keeps prompt logic in the backend for consistency (rough sketch after this list)
  • Includes logging to track performance (and prompt success/failures)
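
To make the mode/tone/backend-prompt idea concrete, here's a stripped-down sketch of the kind of server module I mean. The mode names match the list above, but the function names and German wording are illustrative placeholders, not my production prompts:

```python
# Hypothetical Anvil server module (names and German wording are illustrative).
import anvil.server

# One prompt fragment per UI mode
MODE_PROMPTS = {
    "summarize": "Fasse den folgenden Text knapp zusammen.",
    "explain":   "Erkläre den folgenden Text in einfachen Worten.",
    "translate": "Übersetze den folgenden Text ins Deutsche.",
    "rewrite":   "Formuliere den folgenden Text neu.",
}

# Tone/formality fragments appended to the mode prompt
TONE_PROMPTS = {
    "du_locker":   "Sprich den Nutzer mit 'Du' an und bleibe locker.",
    "sie_formell": "Sprich den Nutzer mit 'Sie' an und bleibe formell.",
}

@anvil.server.callable
def build_system_prompt(mode, tone):
    """Assemble the system prompt server-side so the client never sees prompt logic."""
    if mode not in MODE_PROMPTS:
        raise ValueError(f"Unknown mode: {mode}")
    parts = [MODE_PROMPTS[mode]]
    if tone in TONE_PROMPTS:
        parts.append(TONE_PROMPTS[tone])
    return " ".join(parts)
```

The client form only ever sends a mode name, a tone key, and the user's text, which is what keeps the behaviour consistent across forms.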

How I built it

  • The frontend is 100% Anvil — I reused parts of the interface from the feedback tool
  • Server modules manage structured prompts and API calls to OpenAI (simplified sketch after this list)
  • Uplink handles scheduled tasks and batch runs (still testing background task alternatives)
  • All content is dynamically switched based on user language (German/English fallback)
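
Here's a simplified sketch of what the server-side OpenAI call looks like in that setup. It assumes the openai>=1.0 Python package, a placeholder model name, an Anvil Secret for the API key, and a Data Table called prompt_log for the logging mentioned above - the names are placeholders for this post, not the exact ones in my app:

```python
# Sketch of the server-side OpenAI call, assuming the openai>=1.0 package,
# an Anvil Secret named 'openai_api_key', and a Data Table 'prompt_log'.
import datetime
import anvil.server
import anvil.secrets
from anvil.tables import app_tables
from openai import OpenAI

client = OpenAI(api_key=anvil.secrets.get_secret("openai_api_key"))

@anvil.server.callable
def run_task(mode, tone, user_text):
    system_prompt = build_system_prompt(mode, tone)  # from the earlier sketch
    try:
        response = client.chat.completions.create(
            model="gpt-4o-mini",  # placeholder model name
            messages=[
                {"role": "system", "content": system_prompt},
                {"role": "user", "content": user_text},
            ],
        )
        answer = response.choices[0].message.content
        success = True
    except Exception as exc:
        answer = f"Entschuldigung, da ist etwas schiefgelaufen: {exc}"
        success = False

    # Logging used for the performance / prompt success tracking mentioned above
    app_tables.prompt_log.add_row(
        created=datetime.datetime.now(),
        mode=mode,
        tone=tone,
        success=success,
    )
    return answer
```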

Questions I’m still working through:

  • What’s the best way to store reusable prompt templates for different tasks?
  • Has anyone integrated a proper stateful conversation (multi-turn memory) with GPT in Anvil?
  • Is it worth using a caching layer to avoid duplicate prompt calls for the same input? (rough sketch of what I mean below)
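
For the caching question, this is roughly what I have in mind - hash the full request and check a Data Table before hitting the API (table and column names are placeholders):

```python
# Sketch of the caching idea: hash the full request, check a 'prompt_cache'
# Data Table, and only call OpenAI on a miss. Names are placeholders.
import hashlib
import anvil.server
from anvil.tables import app_tables

def _cache_key(mode, tone, user_text):
    raw = f"{mode}|{tone}|{user_text}".encode("utf-8")
    return hashlib.sha256(raw).hexdigest()

@anvil.server.callable
def run_task_cached(mode, tone, user_text):
    key = _cache_key(mode, tone, user_text)
    cached = app_tables.prompt_cache.get(key=key)
    if cached is not None:
        return cached["answer"]               # duplicate input: skip the API call
    answer = run_task(mode, tone, user_text)  # un-cached call from the earlier sketch
    app_tables.prompt_cache.add_row(key=key, answer=answer)
    return answer
```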

I’d love to hear how others have handled similar cases, especially when localizing AI interfaces or dealing with prompt complexity.

Happy to share a demo or the component structure if anyone’s interested.


Can’t speak for everyone, but I love it when people advertise their Anvil app, and it’s even more exciting if it powers a business!


I have no problem either with you showing your solution. I’d also be interested in how you manage to cover the costs, and in whatever you can show us from your solution to help us implement it in ours. Cheers, Aaron


For what it’s worth, we are very happy if you’re advertising any Anvil-built app in the Show and Tell forum - from small hobby projects to large commercial enterprises. We love seeing what people are building!

Especially when you walk us through how you built it - it’s super helpful for us to see, and it can benefit the whole community too.


At the moment I’m just covering the OpenAI costs myself since usage is still fairly low (mostly internal testing and some small pilots). If this grows, I might look at either limiting requests per user (rough sketch below) or adding a simple subscription model.
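
For the per-user limit, I'd probably start with something roughly like this - it assumes Anvil's Users service is enabled and that the prompt_log sketch from my earlier post also records which user made each call:

```python
# Rough per-user daily limit, assuming the Users service and that the
# 'prompt_log' table from the earlier sketch also gets a 'user' link column.
import datetime
import anvil.server
import anvil.users
from anvil.tables import app_tables

DAILY_LIMIT = 50  # arbitrary example quota

@anvil.server.callable
def run_task_limited(mode, tone, user_text):
    user = anvil.users.get_user()
    if user is None:
        raise Exception("Bitte zuerst einloggen.")
    today = datetime.date.today()
    # Count today's calls by this user
    used_today = sum(1 for row in app_tables.prompt_log.search(user=user)
                     if row["created"].date() == today)
    if used_today >= DAILY_LIMIT:
        raise Exception("Tageslimit erreicht - bitte morgen wieder versuchen.")
    return run_task_cached(mode, tone, user_text)  # caching sketch from my earlier post
```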

I can share a simplified version of the Anvil server module and how I’m structuring the prompts — that should be enough to adapt for your own use without exposing any keys. I’ll put together a minimal example and post it soon.

Thank you, sounds great, looking forward to seeing it. I also think there will have to be some kind of subscription.