From prompt engineering fundamentals to autonomous agent systems — every concept practiced hands-on, every tool used on real problems.
“Before you can build with AI, you need to speak its language.”
Five days of structured practice: from zero to writing system prompts that control tone, format, and reasoning depth. Students work inside Claude and OpenAI Playground, learning temperature, top-p, and stop sequences hands-on — not from slides. By Friday, everyone has built a custom AI assistant with memory and personality.
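The sampling controls mentioned above can be made concrete in a few lines. This is a minimal sketch of how temperature and top-p shape token selection; the function name and logit values are illustrative, and real playgrounds apply these settings server-side inside the model:

```python
import math
import random

def sample_token(logits, temperature=1.0, top_p=1.0):
    """Illustrative sampler: how temperature and top-p shape token choice.

    Hypothetical helper for teaching purposes; not any provider's actual API.
    """
    # Temperature rescales logits: values < 1 sharpen the distribution
    # (more deterministic), values > 1 flatten it (more creative).
    scaled = [l / temperature for l in logits]
    m = max(scaled)
    exps = [math.exp(l - m) for l in scaled]
    total = sum(exps)
    probs = [e / total for e in exps]

    # Top-p (nucleus) sampling: keep the smallest set of tokens whose
    # cumulative probability reaches top_p, then sample from that set.
    ranked = sorted(range(len(probs)), key=lambda i: probs[i], reverse=True)
    kept, cum = [], 0.0
    for i in ranked:
        kept.append(i)
        cum += probs[i]
        if cum >= top_p:
            break
    mass = sum(probs[i] for i in kept)
    r = random.random() * mass
    for i in kept:
        r -= probs[i]
        if r <= 0:
            return i
    return kept[-1]

logits = [2.0, 1.0, 0.1]
# Near-zero temperature behaves like greedy decoding: always the top token.
print(sample_token(logits, temperature=0.01, top_p=1.0))
```

Stop sequences are simpler still: the generation loop halts as soon as the output ends with one of the configured strings.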
Not every model needs a cloud. This module covers running language models on your own hardware — when it makes sense, how to set it up, and what you trade off in speed, privacy, and capability. Students install Ollama, pull quantized models, and benchmark inference on consumer GPUs.
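The trade-off behind quantization comes down to simple arithmetic: each weight costs a fixed number of bits, so halving the bit width halves the memory. A back-of-envelope sketch (weights only; real runtimes add KV cache and framework overhead on top):

```python
def weight_memory_gb(n_params_billion, bits_per_weight):
    """Rough memory needed just to hold the weights.

    Ignores KV cache, activations, and runtime overhead, which add more.
    """
    bytes_total = n_params_billion * 1e9 * bits_per_weight / 8
    return bytes_total / 1e9  # decimal GB

for bits in (16, 8, 4):
    print(f"7B model at {bits}-bit: ~{weight_memory_gb(7, bits):.1f} GB")
```

This is why a 4-bit 7B model fits on a consumer GPU with 8 GB of VRAM while the full-precision version does not.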
“The real skill isn't using AI tools — it's knowing which problem to point them at.”
This week bridges the gap between technology and business thinking. Students identify real use cases, build a business plan with AI support, develop marketing assets with generative tools, and present a startup pitch, complete with AI-generated visuals, a value proposition, and market analysis powered by Tree of Thoughts prompting and Porter's Five Forces.
A hands-on comparison of tools that write, format, and optimize job applications. Students test each tool with their own CV and measure the output quality. The module ends with a critical analysis: where does AI save time, and where does human judgment still win?
From Perplexity deep-dives to AI-driven competitive analysis. Students learn to use search-augmented generation for market research, build a Business Model Canvas with AI assistance, and develop a value proposition using the Ventures Kit.
Canva, Adobe Firefly, Ideogram, Piktochart — the tools are easy. The hard part is developing a visual language that works across channels. Students create a cohesive brand kit: logo, social media templates, infographic, and presentation deck.
Understanding diffusion models at a conceptual level, then applying them. Students compare Midjourney, DALL-E 3, Stable Diffusion, and Runway across the same creative brief — prompting strategies, negative prompts, style transfer, and the current limits of generative consistency.
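The conceptual core of diffusion models can be shown without any ML library. In the forward process, an image is progressively mixed with noise according to a schedule; generation learns to reverse this. A sketch of one forward jump, with an illustrative linear beta schedule not tuned to any particular model:

```python
import math
import random

def forward_diffuse(x0, t, T=1000):
    """One jump of the DDPM-style forward process.

    x_t = sqrt(abar_t) * x0 + sqrt(1 - abar_t) * noise,
    where abar_t is the running product of (1 - beta_i).
    Schedule constants are illustrative, not from a real model.
    """
    betas = [1e-4 + (0.02 - 1e-4) * i / (T - 1) for i in range(T)]
    abar = 1.0
    for i in range(t):
        abar *= 1.0 - betas[i]
    signal = math.sqrt(abar)          # how much of the original survives
    noise_scale = math.sqrt(1.0 - abar)  # how much is pure noise
    noisy = [signal * x + noise_scale * random.gauss(0, 1) for x in x0]
    return noisy, signal

_, s_early = forward_diffuse([0.5] * 4, t=10)
_, s_late = forward_diffuse([0.5] * 4, t=900)
print(f"signal kept at t=10:  {s_early:.3f}")
print(f"signal kept at t=900: {s_late:.3f}")
```

Early timesteps keep almost all of the image; late ones are nearly pure noise, which is why a generator can start from random noise and denoise its way back.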
“Data without decisions is noise. Automation without understanding is fragile.”
Week three makes the abstract concrete: students process real datasets in Python and Julius.ai, build dashboards that tell a story, and wire up their first Make.com automations end-to-end. The capstone exercise: design, build, and present a newsletter automation that runs without human intervention.
The mathematical intuition behind every AI tool on the market. Not formulas on a whiteboard — but understanding why a model predicts what it predicts. Students train a simple classifier in Google Colab and evaluate it with real metrics.
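The intuition that a model "predicts what it predicts" by comparing new inputs to what it has seen can be shown with the simplest possible classifier. This is a stand-in sketch, not the actual Colab exercise: a nearest-centroid model on a toy dataset, evaluated with a real metric.

```python
def train_centroids(X, y):
    """Nearest-centroid classifier: the mean point of each class IS the model."""
    sums, counts = {}, {}
    for x, label in zip(X, y):
        sums.setdefault(label, [0.0] * len(x))
        counts[label] = counts.get(label, 0) + 1
        for i, v in enumerate(x):
            sums[label][i] += v
    return {c: [s / counts[c] for s in sums[c]] for c in sums}

def predict(centroids, x):
    def dist2(a, b):
        return sum((u - v) ** 2 for u, v in zip(a, b))
    # Predict the class whose centroid is closest to the input.
    return min(centroids, key=lambda c: dist2(centroids[c], x))

# Toy 2-D dataset: two well-separated clusters.
X = [(0, 0), (1, 0), (0, 1), (5, 5), (6, 5), (5, 6)]
y = ["a", "a", "a", "b", "b", "b"]
model = train_centroids(X, y)
preds = [predict(model, x) for x in X]
accuracy = sum(p == t for p, t in zip(preds, y)) / len(y)
print(f"training accuracy: {accuracy:.2f}")
```

The in-class version uses a proper library and messier data, but the principle is the same: training summarizes the data, prediction measures similarity, and a metric tells you whether to trust the result.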
Make.com from first principles. Students build three scenarios of increasing complexity: a simple webhook listener, a multi-step data transformation pipeline, and a full newsletter automation with conditional logic — everything needed to replace manual workflows.
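The scenarios above share one shape: data flows through ordered modules, and a filter can stop the flow. A minimal sketch of that shape in plain Python (the step names and record fields are invented for illustration; Make.com configures this visually rather than in code):

```python
def run_pipeline(record, steps):
    """Run a record through ordered steps, like modules in an automation scenario.

    Each step may transform the record or return None to stop the flow,
    the code equivalent of a filter that doesn't match.
    """
    for step in steps:
        record = step(record)
        if record is None:
            return None
    return record

def parse_signup(raw):
    # Normalize the incoming webhook payload.
    return {"email": raw.get("email", "").strip().lower(), "source": raw.get("source")}

def only_newsletter(rec):
    # Conditional logic: drop records that didn't come from the newsletter form.
    return rec if rec["source"] == "newsletter" else None

def add_greeting(rec):
    rec["greeting"] = f"Welcome, {rec['email']}!"
    return rec

steps = [parse_signup, only_newsletter, add_greeting]
print(run_pipeline({"email": "  Ada@Example.com ", "source": "newsletter"}, steps))
print(run_pipeline({"email": "bob@example.com", "source": "ad"}, steps))
```

Seeing the pattern once in code makes the visual scenario builder much easier to reason about: every module is a function, every filter is a conditional.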
How to learn faster with AI as a research partner. Students explore NotebookLM for source synthesis, Consensus for scientific paper search, and build a personal knowledge management system that turns unstructured reading into structured insights.
“The final week is where everything converges — code, models, and autonomous systems.”
Students go from consuming AI to building with it. They host models on Hugging Face, prototype apps with v0 and Lovable, understand MCP (Model Context Protocol), and build an agentic workflow that chains multiple AI steps into an autonomous system. The course ends with 'MCCannes' — a creative challenge where teams produce a short AI-generated film.
The GitHub of machine learning. Students navigate the model hub, test models in Spaces, deploy one via the Inference API, and embed it into a simple application. The module demystifies model cards, licensing, and the difference between a 7B and 70B parameter model.
From single-turn prompts to multi-step autonomous systems. Students build an agent that reasons about a task, breaks it into subtasks, executes them sequentially, and validates its own output — and learn when an agent should ask for human approval.
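The plan-execute-validate loop described above can be sketched in a few lines. Here the three callables stand in for LLM calls; in a real agent each would be a model invocation, and an unresolvable failure escalates to a human instead of looping forever. All names are illustrative:

```python
def run_agent(goal, plan_fn, execute_fn, validate_fn, max_retries=2):
    """Minimal agent loop: plan -> execute each subtask -> self-validate.

    Stub version for illustration; plan_fn, execute_fn, and validate_fn
    would be LLM calls in a real system.
    """
    results = []
    for subtask in plan_fn(goal):
        for _attempt in range(max_retries + 1):
            output = execute_fn(subtask)
            if validate_fn(subtask, output):
                results.append(output)
                break  # subtask done, move on
        else:
            # Retries exhausted: stop and ask for human approval.
            return {"status": "needs_human_approval", "failed_subtask": subtask}
    return {"status": "done", "results": results}

# Stubbed "model": the planner splits the goal, the executor uppercases
# each subtask, and the validator checks the executor's work.
plan = lambda goal: [f"step {i}: {goal}" for i in (1, 2)]
execute = lambda task: task.upper()
validate = lambda task, out: out == task.upper()
print(run_agent("summarize report", plan, execute, validate))
```

The escalation branch is the important design choice: a validator that can say "I'm not sure" and hand control back to a person is what separates a usable agent from a runaway one.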
AI-assisted coding — what the industry calls 'vibe coding.' Students build a working prototype in v0, extend it in Lovable, and learn how Claude Code turns natural language into full-stack applications. The module includes a live app challenge: teams have 90 minutes to go from idea to deployed prototype.
From landing page to deployed web application — without writing code by hand. Students use AI tools to generate, iterate, and deploy websites. The module answers the key question: when is no-code enough, and when do you need a developer?
Text-to-speech, voice cloning, lip-sync avatars, and automated video production. Students clone their own voice with ElevenLabs, create a talking-head video with HeyGen, and build a complete video pipeline. The 'MCCannes' challenge uses these tools for a creative short film.
One of the HTML presentations built for the course — fully interactive. Use the arrows, the dots below, or your keyboard.