~/local-llm-playground / 07_prompt $
airgap_ok gpu: WebGPU

open ./07_prompt.html

Same user message, three different system prompts. Watch how the framing reshapes each response: writing effective system prompts is one of the most transferable skills in working with LLMs.

setup // Ollama gives more consistent results for this exercise
Local runs columns sequentially (WebGPU limit) · Ollama runs all three in parallel
Select gemma3-270m-it-q4_0-web.task to begin.
user_message // ctrl+enter to generate
sys_prompt_a
tokens: 0 time: 0.0s speed:
Output will appear here…
sys_prompt_b
tokens: 0 time: 0.0s speed:
Output will appear here…
sys_prompt_c
tokens: 0 time: 0.0s speed:
Output will appear here…
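The three columns above can be reproduced against a local Ollama server as three chat requests that share one user message and differ only in the system prompt. A minimal sketch, assuming Ollama's `/api/chat` endpoint on the default port 11434; the prompt texts and model tag (`gemma3:270m`) are illustrative placeholders, not the page's actual defaults:

```python
import json

# Hypothetical system prompts; on the page you type your own
# into sys_prompt_a / sys_prompt_b / sys_prompt_c.
SYSTEM_PROMPTS = {
    "sys_prompt_a": "You are a terse assistant. Answer in one sentence.",
    "sys_prompt_b": "You are a patient teacher. Explain step by step.",
    "sys_prompt_c": "You are a skeptic. Point out what could go wrong.",
}

def build_requests(user_message, model="gemma3:270m"):
    """Build one Ollama /api/chat payload per system prompt.

    The same user message appears in every payload; only the
    system message differs, which is the whole experiment."""
    return {
        name: {
            "model": model,
            "messages": [
                {"role": "system", "content": system},
                {"role": "user", "content": user_message},
            ],
            "stream": False,
        }
        for name, system in SYSTEM_PROMPTS.items()
    }

payloads = build_requests("Explain what a system prompt does.")
# Each payload can be POSTed to http://localhost:11434/api/chat,
# e.g. requests.post(url, json=payload), one request per column;
# with Ollama the three requests can run in parallel.
print(json.dumps(payloads["sys_prompt_a"], indent=2))
```

Because the user message is held constant, any difference between the three outputs is attributable to the system prompt alone.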