
LIA - Oral Presentation Guide

Format: 15 min presentation + 5 min Q&A · Level: Advanced

Presentation Overview

Your oral presentation is worth 15% of your LIA grade. You have 15 minutes to present your project, followed by 5 minutes of questions from the instructor. This is your opportunity to demonstrate not only that your system works, but that you understand every decision you made.


Time Allocation

Stick to this time breakdown. Practicing with a timer is essential.

| # | Section | Duration | Slides | Key Content |
|---|---------|----------|--------|-------------|
| 1 | Title & Introduction | 1 min | 1-2 | Project name, problem statement, your name |
| 2 | Problem & Dataset | 1 min | 1 | Business context, dataset stats, objectives |
| 3 | Model Training & Results | 3 min | 2-3 | EDA highlights, models compared, metrics table, best model justification |
| 4 | API Architecture | 2 min | 1-2 | Endpoints, validation, error handling, framework choice |
| 5 | Live Demo | 3 min | 0 (live) | Health check, valid prediction, invalid input, model-info |
| 6 | Testing | 1.5 min | 1 | Test summary, coverage %, Postman highlights |
| 7 | Explainability | 1.5 min | 1-2 | SHAP/LIME visualizations, key insights |
| 8 | Conclusion & Lessons | 2 min | 1-2 | Summary, lessons learned, what you'd improve |
|   | Total Presentation | 15 min | 10-14 | |
| 9 | Q&A | 5 min | 0 | Answer instructor questions |

Time Management
Time Management

Going over 18 minutes or under 12 minutes will cost you points. Practice with a stopwatch. If you're running long, cut from the model training details — the demo and explainability are more impressive.


Slide-by-Slide Guide

Slide 1: Title Slide

| Element | Content |
|---------|---------|
| Title | Your project name (e.g., "Customer Churn Prediction API") |
| Subtitle | LIA — AI Model Deployment |
| Your name | First and last name |
| Date | Presentation date |
| Course | Course code and name |
First Impression

Start strong. Your title slide should be clean with no clutter. State your project name and jump straight into the problem. Don't spend time on "Thank you for being here" or "Today I will present..." — the audience knows why you're there.

Slide 2: Problem & Context (1 min)

Content to include:

  • What problem does your model solve?
  • Who benefits from this prediction?
  • Why does this matter? (business value)
  • Dataset: name, size, source (1-2 bullet points)

Example:

"Telecom companies lose millions annually to customer churn. This project predicts which customers are likely to leave, enabling proactive retention campaigns. I used the Telco Customer Churn dataset — 7,043 customers with 20 features."

Slides 3-4: Model Training (3 min)

Content to include:

  • 1-2 key EDA insights (show a chart, not a wall of text)
  • Models compared (table with metrics)
  • Best model and why you chose it
  • One interesting finding from training
| Model | Accuracy | F1-Score | AUC-ROC |
|-------|----------|----------|---------|
| Logistic Regression | 0.80 | 0.56 | 0.84 |
| Random Forest | 0.79 | 0.54 | 0.83 |
| XGBoost | 0.81 | 0.63 | 0.87 |

"I selected XGBoost because it has the best AUC-ROC (0.87) and F1-Score (0.63), which matters more than accuracy for this imbalanced dataset."
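The point that F1 and AUC-ROC matter more than accuracy on an imbalanced dataset can be made concrete with a small pure-Python sketch (the label counts below are made up for illustration; in the real project you would compute these with `sklearn.metrics`):

```python
# Illustrative only: a "model" that always predicts the majority class
# looks accurate on imbalanced data but has zero recall for churners.
y_true = [0] * 73 + [1] * 27           # ~27% churn rate (made-up numbers)
y_pred = [0] * 100                     # always predict "no churn"

accuracy = sum(t == p for t, p in zip(y_true, y_pred)) / len(y_true)

tp = sum(t == 1 and p == 1 for t, p in zip(y_true, y_pred))
fn = sum(t == 1 and p == 0 for t, p in zip(y_true, y_pred))
recall = tp / (tp + fn)

print(f"accuracy = {accuracy:.2f}")    # 0.73: looks acceptable
print(f"recall   = {recall:.2f}")      # 0.00: misses every single churner
```

This is why the quoted justification leads with F1 and AUC-ROC rather than accuracy: a degenerate classifier can score well on accuracy alone.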

Slide 5: API Architecture (2 min)

Content to include:

  • Framework choice and justification
  • Endpoint table (keep it simple)
  • One example request/response
  • How validation and error handling work
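In FastAPI this validation is usually declared with a Pydantic model, which produces the 422 response automatically. The sketch below mimics the same checks in plain Python so the logic is easy to narrate on a slide (the field names and ranges are hypothetical; adapt them to your own schema):

```python
# Hypothetical request schema: the real project would declare this as a
# Pydantic model and let FastAPI generate the 422 automatically.
def validate_payload(payload: dict) -> tuple[int, str]:
    """Return (status_code, detail) the way the API's validation layer would."""
    if "tenure" not in payload or "monthly_charges" not in payload:
        return 422, "missing required field"
    if not isinstance(payload["tenure"], (int, float)) or payload["tenure"] < 0:
        return 422, "tenure must be a non-negative number"
    if not isinstance(payload["monthly_charges"], (int, float)) or payload["monthly_charges"] <= 0:
        return 422, "monthly_charges must be positive"
    return 200, "ok"

print(validate_payload({"tenure": 12, "monthly_charges": 70.5}))   # (200, 'ok')
print(validate_payload({"tenure": -3, "monthly_charges": 70.5}))   # 422 case
```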

Slides 6-7: Live Demo (3 min)

This is the most impactful part of your presentation. Show your API working in real time.

Demo sequence (recommended order):

| # | Action | What to Show | Expected Result |
|---|--------|--------------|-----------------|
| 1 | GET /health | API is running | `{ "status": "healthy" }` |
| 2 | GET /model-info | Model metadata | Model name, version, metrics |
| 3 | POST /predict (valid) | Real prediction | Prediction + confidence |
| 4 | POST /predict (invalid) | Error handling | 422 with validation error message |
| 5 | GET /docs | Swagger UI | Interactive API documentation |
Demo Best Practices
  • Pre-start your API before the presentation begins. Don't waste 30 seconds running uvicorn.
  • Use Swagger UI (/docs) for the demo — it's visual and impressive.
  • Prepare your requests in advance (bookmarked browser tabs or pre-filled Swagger forms).
  • Have a backup plan: If the demo fails, show screenshots or a pre-recorded video.
Demo Disaster Prevention

Test your demo on the presentation computer before presenting. Common failures:

  • Port already in use (change port)
  • Missing dependencies (use requirements.txt)
  • Model file not found (check relative paths)
  • Firewall blocking connections
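A stdlib-only pre-flight script can catch most of these failures before you walk on stage; run it on the presentation computer and fall back to your screenshots if it fails (the URL and port are assumptions; use whatever your API actually binds to):

```python
import urllib.request
import urllib.error

def api_is_up(url: str, timeout: float = 2.0) -> bool:
    """Pre-flight check: return True only if the health endpoint answers 200."""
    try:
        with urllib.request.urlopen(url, timeout=timeout) as resp:
            return resp.status == 200
    except (urllib.error.URLError, OSError):
        return False

# Run this right before presenting; if it prints False, switch to the
# backup plan (screenshots or the pre-recorded video) immediately.
print(api_is_up("http://127.0.0.1:8000/health"))
```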

Slide 8: Testing (1.5 min)

Content to include:

  • Number of tests and test types
  • Coverage percentage
  • One Postman screenshot (optional)
  • One interesting edge case you discovered

Example:

"I wrote 14 automated tests — 5 unit tests, 6 integration tests, and 3 edge case tests. Code coverage is 82%. I discovered that sending negative values for tenure caused an unhandled error, which led me to add input range validation."
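The negative-tenure edge case in the example is exactly what a small pytest file catches early. A self-contained sketch of the pattern (`clean_tenure` is a hypothetical stand-in for your own preprocessing; with pytest installed, the test functions would live in something like `tests/test_preprocess.py`):

```python
# Hypothetical preprocessing step plus the pytest-style tests that guard it.
def clean_tenure(tenure: float) -> float:
    """Reject out-of-range tenure values before they reach the model."""
    if tenure < 0:
        raise ValueError("tenure cannot be negative")
    return float(tenure)

def test_valid_tenure_passes():
    assert clean_tenure(12) == 12.0

def test_negative_tenure_rejected():
    try:
        clean_tenure(-3)
        assert False, "expected ValueError"
    except ValueError:
        pass

# pytest collects these automatically; calling them directly also works:
test_valid_tenure_passes()
test_negative_tenure_rejected()
print("2 tests passed")
```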

Slides 9-10: Explainability (1.5 min)

Content to include:

  • Method used (LIME, SHAP, or both)
  • One global feature importance chart (SHAP summary plot is ideal)
  • One individual prediction explanation
  • Key insight in plain language

Example narrative:

"Using SHAP, I found that the three most important features are: contract type, monthly charges, and tenure. Month-to-month customers are 3x more likely to churn. Here's a waterfall plot for a specific high-risk customer — you can see that their short tenure and high monthly charges push the prediction strongly toward churn."
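If you are asked how the waterfall plot "adds up", SHAP's local-accuracy property is the answer: baseline plus per-feature contributions equals the prediction. For a linear model the contributions even have a closed form, weight times the feature's deviation from its mean. A tiny sketch with made-up weights (the real project would call `shap.Explainer` on the trained model):

```python
# Illustrative linear model with made-up weights and feature means;
# shap.Explainer would compute these values for a real trained model.
weights = {"tenure": -0.04, "monthly_charges": 0.02, "contract_monthly": 0.9}
feature_means = {"tenure": 32.0, "monthly_charges": 64.8, "contract_monthly": 0.55}
bias = 0.1

customer = {"tenure": 3.0, "monthly_charges": 95.0, "contract_monthly": 1.0}

# For a linear model, the SHAP value of feature i is w_i * (x_i - mean_i).
baseline = bias + sum(w * feature_means[f] for f, w in weights.items())
contributions = {f: w * (customer[f] - feature_means[f]) for f, w in weights.items()}
prediction = bias + sum(w * customer[f] for f, w in weights.items())

# Local accuracy: baseline + sum of contributions reconstructs the prediction.
assert abs(baseline + sum(contributions.values()) - prediction) < 1e-9

# Short tenure (below the mean, negative weight) pushes toward churn:
print(sorted(contributions.items(), key=lambda kv: -abs(kv[1])))
```

Note how the customer's short tenure produces a positive (toward-churn) contribution even though the weight is negative, which is the kind of plain-language insight the slide should deliver.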

Slides 11-12: Conclusion (2 min)

Content to include:

  • Summary of achievements (3-4 bullet points)
  • 2-3 lessons learned (be honest and reflective)
  • 2-3 future improvements
  • Final word

Example conclusion:

"In summary, I built a complete AI prediction service — from data exploration to a deployed, tested, explainable API. Key lessons: data quality matters more than model complexity, and writing tests early saves time. If I had more time, I'd add Docker deployment and monitoring with Prometheus."


Handling Q&A (5 minutes)

The Q&A tests whether you truly understand your project. The instructor will ask 3-5 questions.

Common Questions to Prepare For

Detailed Question Bank

Model Training Questions
| Question | What They're Testing |
|----------|----------------------|
| Why did you choose [model] over [other model]? | Can you justify decisions with metrics and reasoning? |
| How did you handle class imbalance? | Do you know techniques like SMOTE and class weights? |
| What would happen if you used more or fewer features? | Do you understand feature selection? |
| Is your model overfitting? How do you know? | Can you interpret train vs. test performance? |
| Why did you use [metric] as your primary metric? | Do you understand metric trade-offs? |
| What is cross-validation and did you use it? | Do you understand evaluation methodology? |
| How would you retrain the model with new data? | Do you think about the production lifecycle? |
API Design Questions
| Question | What They're Testing |
|----------|----------------------|
| Why FastAPI instead of Flask (or vice versa)? | Can you compare frameworks? |
| How and when is the model loaded? | Do you understand startup vs. per-request loading? |
| What happens if the model file is missing? | Do you handle edge cases? |
| How does Pydantic validation work? | Do you understand your validation layer? |
| What HTTP status code for [scenario]? | Do you know REST conventions? |
| How would you handle concurrent requests? | Do you understand concurrency? |
| How would you add authentication? | Do you think about security? |
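For the "how and when is the model loaded" question, the expected answer is: once at startup (or lazily with a cache), never per request. A stdlib sketch of the load-once pattern (the loader body is a hypothetical stand-in for something like `joblib.load("model.pkl")`):

```python
from functools import lru_cache

load_count = 0  # only here to demonstrate that loading happens once

@lru_cache(maxsize=1)
def get_model():
    """Stand-in for the expensive model load; runs once, result is cached."""
    global load_count
    load_count += 1
    return {"name": "xgboost-churn", "version": "1.0"}  # placeholder object

# Three "requests" all reuse the cached model; the load runs only once.
for _ in range(3):
    model = get_model()

print(load_count)  # 1
```

In FastAPI the same effect is typically achieved by loading the model in a startup/lifespan hook; the key point to articulate is that per-request loading would add seconds of latency to every prediction.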
Testing Questions
| Question | What They're Testing |
|----------|----------------------|
| What is the difference between unit and integration tests? | Do you understand test types? |
| What does 70% code coverage mean? | Do you understand coverage metrics? |
| Is 100% coverage always the goal? | Do you understand testing limitations? |
| What is a test fixture in pytest? | Do you understand your testing framework? |
| How did you test error handling? | Did you test the unhappy paths? |
| What did Postman tests verify that pytest didn't? | Do you understand complementary testing? |
Explainability Questions
| Question | What They're Testing |
|----------|----------------------|
| What is the difference between LIME and SHAP? | Do you understand both methods? |
| What does a SHAP value represent? | Can you explain the math simply? |
| Did you find any unexpected feature importance? | Did you think critically about results? |
| Could there be data leakage in your features? | Do you understand this critical concept? |
| Is your model fair? How would you check? | Do you think about bias and ethics? |
| What are the limitations of your explainability analysis? | Do you understand method limitations? |

Q&A Strategy

| Do | Don't |
|----|-------|
| Listen to the full question before answering | Interrupt the questioner |
| Say "That's a great question" to buy thinking time | Say "I don't know" and stop there |
| Admit if you don't know, then hypothesize | Make up an answer you're not sure about |
| Relate answers back to your specific project | Give generic textbook answers |
| Be concise — 30-60 seconds per answer | Ramble for 3 minutes |
When You Don't Know the Answer

It's okay to not know everything. A strong response is: "I haven't explored that specifically, but based on what I know about [related concept], I would approach it by [hypothesis]." This shows critical thinking even when you lack the specific knowledge.


Presentation Design Tips

Slide Design Rules

| Rule | Good | Bad |
|------|------|-----|
| Text per slide | 4-6 bullet points, key words | Full paragraphs copied from the report |
| Font size | ≥ 24pt for body, ≥ 32pt for titles | 12pt text that nobody can read |
| Visualizations | Charts, diagrams, screenshots | Walls of text |
| Colors | Consistent palette, readable contrast | Rainbow colors, light text on light background |
| Code on slides | Short snippets (5-10 lines max) | Full source files |
| Animations | None or minimal | Flying text and spinning transitions |

Content Strategy

Storytelling Approach

Think of your presentation as a story, not a report:

  1. Hook — Why should the audience care? (Introduction)
  2. Challenge — What was hard? (Training, data issues)
  3. Solution — How did you solve it? (API, testing)
  4. Proof — Does it work? (Demo, explainability)
  5. Reflection — What did you learn? (Conclusion)

What NOT To Do

Presentation Anti-Patterns

| Anti-Pattern | Why It's Bad | What To Do Instead |
|--------------|--------------|--------------------|
| Reading from slides | Shows you don't know the material | Use slides as prompts, speak naturally |
| Showing all your code | Boring, impossible to read | Show 1-2 key snippets, demo the rest |
| Skipping the demo | The demo is 20% of the impression | Always demo, even if briefly |
| No eye contact | Seems unconfident | Look at the audience, not the screen |
| Apologizing ("Sorry, I know this isn't perfect") | Undermines your work | Present confidently, mention improvements in future work |
| Running over time | Disrespectful, loses points | Practice with a timer |
| Live coding during demo | Too risky, wastes time | Have everything running before you start |
| Unexplained jargon | Loses the audience | Define terms briefly if needed |
| No backup plan | Demo failures are devastating | Have screenshots or a video ready |
| Thanking everyone at the start | Wastes 30 seconds | Start with the problem statement |

Preparation Checklist

One Week Before

  • Slide deck is complete (10-14 slides)
  • All visualizations and charts are included
  • Demo script is written (what you'll show, in what order)
  • First practice run done (check timing)

Three Days Before

  • Practice run #2 with timer (aim for 14-15 minutes)
  • Q&A questions reviewed and answers prepared
  • Demo tested on the presentation computer (or similar setup)
  • Backup screenshots saved in slides (in case demo fails)

Day Before

  • Final practice run (ideally in front of someone)
  • API tested and working
  • Slide deck exported to PDF (backup format)
  • All files on USB drive or accessible cloud storage

Day Of

  • Arrive early, set up equipment
  • Start your API before the presentation begins
  • Open Swagger UI in a browser tab
  • Open your slide deck
  • Take a deep breath — you've prepared well

Grading Rubric — Oral Presentation

| Criterion | Weight | Excellent (14-15) | Good (12-13) | Satisfactory (11) | Insufficient (< 11) |
|-----------|--------|-------------------|--------------|-------------------|---------------------|
| Content Completeness | 25% | All components covered: model, API, tests, explainability | Most components covered | Some components missing | Major components missing |
| Technical Depth | 20% | Deep understanding, justifies all decisions | Good understanding, most decisions explained | Surface-level understanding | Cannot explain decisions |
| Live Demo | 20% | Flawless demo, multiple scenarios shown | Demo works with minor hiccups | Demo partially works | Demo fails or not attempted |
| Communication | 15% | Confident, clear, good pace, engages audience | Good delivery, minor nervousness | Adequate but monotone or rushed | Unclear, reading from slides |
| Slides Quality | 10% | Clean, visual, informative, professional | Good design, minor issues | Acceptable but text-heavy | Poor design, hard to read |
| Q&A Performance | 10% | All questions answered confidently | Most questions answered well | Some questions answered | Cannot answer basic questions |

Example Presentation Outline

Here is a concrete example for a Customer Churn Prediction project:

| Slide | Title | Content |
|-------|-------|---------|
| 1 | Customer Churn Prediction API | Title, name, date, course |
| 2 | The Problem | "26% of customers churn quarterly — $2M annual loss" |
| 3 | Dataset & EDA | Telco dataset, 7043 samples, class distribution chart |
| 4 | Model Comparison | Table: LogReg vs RF vs XGBoost, metrics highlighted |
| 5 | Best Model: XGBoost | AUC-ROC=0.87, confusion matrix, why XGBoost won |
| 6 | API Architecture | Diagram: Client → FastAPI → Model → Response |
| 7 | Endpoints & Validation | Table of endpoints, Pydantic example |
| 8-9 | LIVE DEMO | Swagger UI: /health → /predict → /model-info → error case |
| 10 | Testing Results | 14 tests, 82% coverage, key edge case found |
| 11 | SHAP Analysis | Summary plot: top 3 features explained |
| 12 | Individual Explanation | Waterfall plot for one high-risk customer |
| 13 | Conclusion & Lessons | Summary, 3 lessons learned, 2 future improvements |
| 14 | Thank You / Questions | Contact info, repository link |