Throughout the Bad At Computer website, I aim for a consistent design language. The monospaced bitmap fonts, 90s color schemes, and Super Nintendo sprites add up to a feeling of millennial nostalgia. I had been meaning to build more with AI, so I decided to combine two goals: learning how to integrate AI and speeding up my site design tasks.

Description
The primary feature of this .NET 8 Blazor Web App is a predefined text prompt that the user modifies through UI controls. The modified prompt is then sent via an API call to the Dall-E 3 text-to-image model hosted in Azure AI Foundry. The response from Dall-E contains the image URL, which the app then displays.
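A minimal sketch of that round trip, assuming the Azure.AI.OpenAI 2.x SDK and a Dall-E 3 deployment named "dall-e-3"; the endpoint, key, and prompt values are placeholders, not the app's real configuration:

```csharp
using Azure;
using Azure.AI.OpenAI;
using OpenAI.Images;

// Assumed configuration; in a real app these would come from appsettings or user secrets.
var endpoint = new Uri("https://my-foundry-resource.openai.azure.com/");
var credential = new AzureKeyCredential("<api-key>");

// Placeholder for the predefined prompt after the UI controls have modified it.
string prompt = "A pixel-art logo in a 90s color scheme, Super Nintendo sprite style.";

// Client for the Dall-E 3 deployment hosted in Azure AI Foundry.
AzureOpenAIClient client = new(endpoint, credential);
ImageClient imageClient = client.GetImageClient("dall-e-3");

// Send the prompt and read the image URL out of the response.
GeneratedImage image = await imageClient.GenerateImageAsync(
    prompt,
    new ImageGenerationOptions
    {
        Size = GeneratedImageSize.W1024xH1024,
        ResponseFormat = GeneratedImageFormat.Uri
    });

Uri imageUrl = image.ImageUri; // bound to an <img> tag in the Blazor component
```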
Inspiration
I believe generative AI overwhelm is real, and that it can be addressed in part by offering human-friendly UI controls like radio buttons, character limits, and color pickers. These reduce the number of choices a user has to make while still producing results that fit the desired design language, and the user can keep tweaking until the result is right. That consistency is valuable for establishing a brand identity for a business, blog, or artistic project.
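To make the idea concrete, here is a hypothetical sketch of how a few control values could be folded into a fixed prompt template; the record fields and wording are illustrative, not the app's actual prompt:

```csharp
// Hypothetical model of the UI controls: a radio-button style choice,
// a color-picker value (already resolved to a name), and a length-limited subject.
public record LogoOptions(string Style, string ColorName, string Subject);

public static class PromptBuilder
{
    // Fold the user's selections into a fixed template so every generated
    // logo stays inside the same design language.
    public static string Build(LogoOptions options) =>
        $"A {options.Style} logo of {options.Subject}, drawn as a Super Nintendo style sprite, " +
        $"using a 90s color palette dominated by {options.ColorName}, on a plain background.";
}

// Usage:
// var prompt = PromptBuilder.Build(new LogoOptions("pixel-art", "Teal", "a floppy disk"));
```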
At a recent seminar, I learned about Azure AI Foundry, a Microsoft service that provides hundreds of AI models for use in apps like this one at low cost. These models are tried, tested, and secure, as opposed to the wild west of open-source LLMs: content is moderated, hallucinations are less frequent, and the training data complies with current regulations.
The lecture left me inspired to integrate AI into an app. This is actually my first project that calls an LLM API, and building it was easier than I expected thanks to Azure AI Foundry. The app also calls api.color.pizza to fetch basic color names used for styling the logos.
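The color-name lookup is a plain HTTP GET. The rough sketch below assumes api.color.pizza's `/v1/?values=<hex>` endpoint and a response shaped like `{ "colors": [ { "name": ... } ] }`; check the API's documentation for the exact schema:

```csharp
using System.Collections.Generic;
using System.Linq;
using System.Net.Http;
using System.Net.Http.Json;
using System.Text.Json.Serialization;
using System.Threading.Tasks;

// Hypothetical helper that turns a hex value from the color picker into a readable name.
public sealed class ColorNameClient(HttpClient http)
{
    public async Task<string?> GetColorNameAsync(string hex)
    {
        // e.g. GET https://api.color.pizza/v1/?values=1e90ff
        var response = await http.GetFromJsonAsync<ColorPizzaResponse>(
            $"https://api.color.pizza/v1/?values={hex.TrimStart('#')}");
        return response?.Colors?.FirstOrDefault()?.Name;
    }

    private sealed record ColorPizzaResponse(
        [property: JsonPropertyName("colors")] List<ColorEntry>? Colors);

    private sealed record ColorEntry(
        [property: JsonPropertyName("name")] string? Name);
}
```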
Technologies Used
- Blazor Web App
- MudBlazor
- ASP.NET Core, C#
- Dall-E 3
- Azure AI Foundry
- Azure App Service (deployment)
- Web APIs (api.color.pizza)
- JetBrains Rider