Assessment and Rubric Design Guideline for EEE Courses

Department of Electrical and Electronic Engineering, BUET · Outcome Based Education

This page provides a practical guide for designing CO-aligned assessments, compulsory final-exam questions, laboratory and project rubrics, course-file evidence, and attainment-support tables for BUET EEE courses. It preserves the fixed BUET EEE theory-course assessment structure and provides flexible rubric-based guidance for sessional/laboratory courses.

Fixed theory structure (10% + 20% + 70%) · Section A/B compulsory CO questions · Lab and project rubric library · Offline JavaScript tools · Print-friendly course-file support

Disclaimer and scope of use

This page is an internal academic support guide for the Department of Electrical and Electronic Engineering, BUET. It is intended to help faculty prepare CO-aligned assessments, rubrics, course-file evidence, and attainment-support documents in a consistent OBE format.

The guidance, templates, rubrics, and browser-based tools provided here are recommended practice unless explicitly stated otherwise in an approved BUET, EEE Department, BAETE, or other official accreditation document. They do not replace the official BUET academic ordinances, departmental decisions, course committee decisions, examination rules, BAETE manuals, or any formally approved departmental attainment-calculation policy.

For theory courses, this page preserves the fixed BUET EEE assessment structure: class participation 10%, continuous assessment/class tests 20%, and term final examination 70%. The interactive calculators and generated tables are provided only as planning and documentation aids. Final CO/PO attainment calculations, evidence sampling, moderation procedures, and CQI decisions should follow the current approved policy of the department and the university.

1. Page purpose and target users

The purpose of this page is to help BUET EEE faculty design defensible assessment evidence for Course Outcomes (COs) and Program Outcomes (POs), while respecting the department’s established theory-course assessment structure. The guide is public-facing so that students and stakeholders can understand how learning is assessed, but the templates and tools are designed for faculty preparing course outlines, question papers, rubrics, course files, and accreditation evidence.

Faculty members

Prepare CO-aligned assessment plans, class tests, final-exam blueprints, rubrics, and evidence tables.

Course coordinators

Check balance across sections, COs, Bloom levels, and evidence sources before finalizing course files.

Paper setters and moderators

Use compulsory questions and marking schemes as direct CO evidence without overclaiming PO mapping.

Lab instructors

Assess practical performance, design projects, demonstrations, reports, viva, teamwork, and ethics using rubrics.

OBE committee members

Review traceability from COs to assessment artifacts and support continuous quality improvement (CQI).

Students and reviewers

Understand expectations, evidence, and transparent performance criteria in EEE courses.

2. Research summary: assessment and rubrics in OBE

Important distinction: This page separates formal requirements from recommended practice. BAETE/Washington Accord documents define outcome-oriented accreditation expectations. Local assessment implementation must follow BUET EEE and BUET academic rules. Rubrics, blueprints, and evidence checklists are recommended practice unless explicitly adopted by the department.

Key findings from accreditation and engineering-education sources

| Finding | Local interpretation for BUET EEE | Status |
| --- | --- | --- |
| OBE requires evidence that students achieve stated outcomes, not only that topics are taught. | Each EEE course file should preserve traceable links from COs to questions, rubrics, marks, scripts, project artifacts, and attainment analysis. | Accreditation-aligned expectation |
| Washington Accord-style graduate attributes include engineering knowledge, problem analysis, design/development, investigation, modern tool usage, ethics, teamwork, communication, project management, and lifelong learning. | EEE rubrics are needed for outcomes that cannot be validly assessed by routine numerical exam answers alone. | Accreditation-aligned expectation |
| Constructive alignment connects intended outcomes, teaching-learning activities, and assessment tasks. | CO wording, Bloom level, question type, and assessment instrument should agree; a high-level design CO should not be assessed only by substitution-type problems. | Recommended practice |
| Rubrics improve transparency and rater consistency when criteria and performance descriptors are explicit. | Lab reports, project demonstrations, viva, peer evaluation, presentation, and design reports should use analytic rubrics with preserved score sheets. | Recommended practice |
| Direct assessment uses student work; indirect assessment uses perception or supporting evidence. | Final questions, class tests, lab reports, demonstrations, viva, and projects are direct evidence. Surveys, course exit feedback, and student reflections are indirect/supporting evidence. | Recommended practice |
| Formative assessment supports improvement during learning; summative assessment supports judgment at the end. | Class tests and progress reviews may provide formative feedback and summative marks; final examination and final project demonstration are primarily summative. | Recommended practice |
| Assessment quality depends on validity, reliability, fairness, transparency, and moderation. | EEE course files should include question blueprints, marking schemes, moderated question papers, rubrics, and post-exam CO-wise analysis. | Recommended practice |

Why rubrics are needed for selected POs

Routine written problems can assess engineering knowledge and some problem analysis outcomes, but they are often insufficient for design, investigation, ethics, teamwork, communication, and contextual considerations. Rubrics allow assessors to record the quality of design choices, evidence-based interpretation, safe practice, responsible conduct, individual contribution, and communication quality. A rubric does not automatically prove PO attainment; it provides structured evidence that can feed the department-approved attainment process.

Direct vs indirect assessment

Direct assessment evaluates actual student work, such as answer scripts, lab reports, design files, hardware/software demonstrations, viva responses, project reports, and presentation performance. Indirect assessment captures perception or supporting information, such as course exit surveys, alumni feedback, employer feedback, or student reflections. Direct evidence should normally be the primary basis for CO attainment.

Formative vs summative assessment

Formative assessment supports learning during the course through feedback, such as class tests, lab observations, proposal reviews, and progress presentations. Summative assessment judges achievement after instruction, such as term final examination, final lab test, final demonstration, viva, and final report.

Validity, reliability, fairness, and constructive alignment

An assessment is more valid when it actually measures the intended CO and Bloom level. It is more reliable when different assessors would apply similar judgment. It is fairer when students know criteria and comparable students are judged by comparable evidence. It is constructively aligned when the CO, learning activity, assessment task, marking scheme, and evidence record point to the same learning target.

3. BUET EEE assessment context

EEE 303-like theory courses

For BUET EEE theory courses, the assessment structure is fixed and should not be changed by this guideline. Faculty should instead align the fixed components with COs, Bloom levels, question tags, evidence collection, and attainment-support tables.

| Component | Weight | OBE use | Evidence to preserve |
| --- | --- | --- | --- |
| Class participation | 10% | Supports engagement; should not be used as the only evidence for technical CO attainment. | Attendance/participation record and policy note. |
| Continuous assessment / class tests | 20% | Normally four class tests. Useful for formative feedback and direct CO evidence when question-wise marks are CO-tagged. | CT question papers, CO tags, sample scripts, marks table. |
| Term final examination | 70% | Three-hour comprehensive exam with Section A and Section B. Compulsory questions in both sections can provide direct CO assessment evidence. | Question paper, moderation record, compulsory question marking scheme, CO-wise mark extraction, sample scripts. |

EEE 304-like laboratory/sessional courses

Laboratory and sessional courses have more flexible assessment instruments. An EEE 304-like Digital Systems Laboratory can assess practical skill, Verilog/FPGA implementation, tool usage, debugging, project design, technical documentation, ethics, teamwork, communication, and contextual considerations through a combination of lab performance, reports, practical tests, quiz, project proposal, progress review, demonstration, viva, peer evaluation, and final report/user manual.

| Instrument | Possible evidence | Suitable CO/PO evidence |
| --- | --- | --- |
| Lab performance | Instructor observation, circuit/setup check, instrument/tool use, debugging record | Practical implementation, modern tool usage, safety, psychomotor outcomes |
| Lab report | Data, observations, analysis, limitation discussion, conclusion | Investigation, analysis, technical communication |
| Lab test / practical test | Individual implementation, troubleshooting, viva | Individual technical competence and tool use |
| Design project | Proposal, design choices, constraints, implementation, validation, demo, report | Design/development, modern tools, contextual considerations, communication |
| Peer evaluation and viva | Individual contribution, accountability, team role, technical understanding | Teamwork, ethics, individual accountability |

4. Recommended page structure

Introduction and source notes

Explain the aim, the fixed theory-course structure, and the distinction between official rules and recommended practice.

BUET EEE assessment structure

Summarize theory and lab/sessional assessment patterns using EEE 303 and EEE 304 as local reference cases.

Theory-course assessment model

Give CT blueprint, final-exam Section A/B compulsory-question model, question-to-CO mapping, and evidence table.

Laboratory/sessional model

Provide lab performance, report, lab test, project, viva, peer evaluation, presentation, and report assessment plan.

Rubric design principles

Explain analytic rubrics, criteria, levels, descriptors, weights, rater consistency, moderation, and transparency.

Interactive tools and templates

Provide blueprint builder, CO tagger, rubric builder, lab planner, evidence checker, attainment support calculator, and checklist.

5. BUET EEE theory-course assessment guideline

The fixed 10% + 20% + 70% structure should be treated as the assessment frame. OBE work is done inside this frame by making the assessment traceable: each selected question/sub-question should identify the CO, PO, Bloom level, marks, assessment type, and whether it is used as direct CO evidence.

Recommended workflow

  1. Write 3–5 measurable COs with Bloom levels appropriate to the course.
  2. Prepare a CO-wise assessment coverage plan across four class tests and final examination.
  3. Use class tests to sample early and mid-course COs and provide feedback.
  4. Design Section A and Section B compulsory questions to cover the most important direct CO evidence.
  5. Tag every relevant sub-question with CO, PO, Bloom level, marks, and direct evidence status.
  6. Prepare marking schemes or solution rubrics for compulsory questions.
  7. Extract CO-wise marks after evaluation and preserve the evidence trail.
Recommended practice: A small sub-question should normally assess one primary CO. A multi-part question may assess more than one CO only when marks are separated and each part has a defensible mapping.
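The tagging and extraction steps above can be sketched as a small data structure plus a CO-wise roll-up. This is an illustrative sketch only; the field names are assumptions for this example, not the official appData schema or an approved format.

```javascript
// Illustrative blueprint rows: each tagged sub-question records its
// primary CO, PO, Bloom level, marks, and direct-evidence status.
// (Field names are assumptions for this sketch, not an official schema.)
const blueprint = [
  { question: "Q1(a)", co: "CO1", po: "PO(a)", bloom: "C3", marks: 10, direct: true },
  { question: "Q1(b)", co: "CO1", po: "PO(c)", bloom: "C4", marks: 5, direct: true },
  { question: "Q5(d)", co: "CO3", po: "PO(c)", bloom: "C5", marks: 5, direct: true },
];

// Roll up direct-evidence marks per CO for the attainment-support table.
function coWiseMarks(rows) {
  const totals = {};
  for (const r of rows) {
    if (!r.direct) continue;
    totals[r.co] = (totals[r.co] || 0) + r.marks;
  }
  return totals;
}

// coWiseMarks(blueprint) → { CO1: 15, CO3: 5 }
```

Because each sub-question carries exactly one primary CO, the roll-up stays traceable back to individual marked parts.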

Example CO set for an EEE 303-like course

| CO | Sample outcome statement | Likely Bloom level | Typical evidence |
| --- | --- | --- | --- |
| CO1 | Analyze and implement combinational digital systems using Boolean algebra, logic gates, MSI components, and design constraints. | C3–C4 | Class tests, final Section A/B sub-questions |
| CO2 | Analyze and design sequential digital systems using flip-flops, counters, registers, timing concepts, and finite-state-machine models. | C4–C5 | Class tests, final compulsory questions |
| CO3 | Design digital systems using HDL/Verilog concepts and physical realization constraints, including design choices and justification. | C5–C6 | Final compulsory design question, lab/project linkage where applicable |

Sample class test blueprint

| Class test | Main topic | Primary CO | Bloom | Suggested evidence use |
| --- | --- | --- | --- | --- |
| CT-1 | Boolean simplification and combinational logic | CO1 | C2–C4 | Direct evidence for CO1 |
| CT-2 | Decoder, encoder, MUX, arithmetic circuits | CO1 | C3–C4 | Direct evidence for CO1 |
| CT-3 | Flip-flops, registers, counters | CO2 | C3–C4 | Direct evidence for CO2 |
| CT-4 | FSM and HDL-oriented design interpretation | CO2/CO3 | C4–C5 | Direct evidence if marks are separated |

Sample final examination Section A/B CO assessment plan

| Section | Question | Compulsory? | Target CO | Bloom | Marks | Evidence note |
| --- | --- | --- | --- | --- | --- | --- |
| A | Q1(a–c) | Yes | CO1 | C3–C4 | 15 | Combinational analysis/design with explicit mark split |
| A | Q1(d) | Yes | CO3 | C5 | 5 | Design choice/justification required |
| B | Q5(a–c) | Yes | CO2 | C4–C5 | 15 | Sequential/FSM analysis and synthesis |
| B | Q5(d) | Yes | CO3 | C5–C6 | 5 | HDL/realization trade-off discussion |

6. Compulsory-question guideline for term final examination

Compulsory questions in Section A and Section B are useful for OBE because all students answer them; therefore, they provide a common direct evidence base for CO assessment. Their design must be intentional and documented.

Recommended structure

Use a clear stem, separated sub-parts, explicit marks, one primary CO per sub-part, Bloom tags, and a marking scheme that states what earns credit.

Multi-CO questions

A compulsory question may assess multiple COs only if the parts are separable, the mark distribution is explicit, and the evidence extraction table can allocate marks to the relevant COs.

Higher Bloom levels

To justify C5/C6 or design mapping, the question should require comparison, synthesis, constraints, trade-off analysis, decision-making, or justification.

Evidence collection

Preserve the moderated question paper, marking scheme, evaluated scripts or samples, CO-wise mark table, and short post-exam analysis.

Warnings for final-exam mapping

  • Do not claim CO3/design attainment from a purely numerical substitution problem.
  • Do not map PO3/design unless the question requires design choices, constraints, synthesis, or justification.
  • Do not map PO4/investigation unless data analysis, experiment interpretation, uncertainty, or evidence-based inference is assessed.
  • Do not map PO5/modern tool usage unless actual tool use or tool-based interpretation is assessed.
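These cautions can be approximated in a planning tool as a simple keyword check that flags mappings for human review. The keyword lists below are illustrative assumptions only; they do not decide validity, and the labels are shorthand for the warnings above, not an official rule set.

```javascript
// Assumed evidence keywords per warning (illustrative only, not an
// official rule set; a moderator makes the final judgment).
const poEvidenceKeywords = {
  "PO3/design": ["design", "constraint", "trade-off", "justif", "synthesi"],
  "PO4/investigation": ["data", "experiment", "uncertain", "inference"],
  "PO5/tools": ["tool", "simulat", "verilog", "fpga"],
};

// Return the claimed POs whose expected evidence keywords are absent
// from the question text, so the mapping can be reviewed by a human.
function mappingWarnings(questionText, claimedPOs) {
  const text = questionText.toLowerCase();
  return claimedPOs.filter((po) => {
    const keys = poEvidenceKeywords[po];
    return keys !== undefined && !keys.some((k) => text.includes(k));
  });
}

// A purely numerical substitution problem claiming the design PO is flagged:
// mappingWarnings("Compute the output of the given circuit.", ["PO3/design"])
//   → ["PO3/design"]
```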

7. Laboratory/sessional-course assessment guideline

An EEE 304-like course can use experiments in the first part and a design project in the second part. The assessment plan should combine group performance with individual accountability. Group artifacts may demonstrate system-level achievement, but individual viva, peer evaluation, logbooks, and contribution records are needed to support individual CO attainment.

| Assessment area | How to assess | Evidence |
| --- | --- | --- |
| Psychomotor/practical skill | Observe setup construction, wiring discipline, measurement, debugging, and equipment handling. | Instructor rubric, lab notebook, photos where allowed, practical test record. |
| Modern tool usage | Assess use of EDA tools, FPGA boards, simulation, measurement instruments, or embedded development tools. | Tool output, code/design file, screenshots, viva, demonstration. |
| Design/development | Require problem definition, requirements, alternatives, constraints, implementation, validation, and trade-off justification. | Proposal, design review, demo, report, viva. |
| Ethics and responsible practice | Assess integrity declaration, safety awareness, honest limitation reporting, responsible component/data/tool use. | Ethics statement, viva record, report section, instructor observation. |
| Teamwork | Combine peer evaluation, individual viva, contribution log, and instructor observation. | Peer forms, individual marks, role documentation. |
| Communication | Assess oral presentation, demo explanation, report, user manual, and response to questions. | Presentation rubric, report rubric, user manual, viva notes. |
Recommended practice: For project assessment, separate group artifact marks from individual marks. A practical distribution may include group demonstration/report marks plus individual viva, peer evaluation, and role documentation.
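One way to realize this separation is a weighted combination of the group-artifact mark and the individual marks. The 60/40 split below is an assumed example for illustration, not a departmental rule.

```javascript
// Combine a group-artifact mark (demo/report) with individual evidence
// (viva, peer evaluation, contribution record). The default 60/40 split
// is an assumed illustration; pass any approved split instead.
function projectMark(groupMark, individualMark, groupShare = 0.6) {
  return groupMark * groupShare + individualMark * (1 - groupShare);
}

// projectMark(80, 60) → 72 with the assumed 60/40 split
```

Keeping the two inputs separate in the record preserves the individual-accountability evidence even after the marks are combined.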

8. Rubric design principles

A rubric is a scoring guide that defines assessment criteria, performance levels, descriptors, and weights. For complex outcomes, an analytic rubric is usually preferable because it separates dimensions such as technical correctness, design justification, safety, teamwork, and communication.

Analytic rubric

Scores multiple criteria separately. Recommended for lab performance, project demonstration, viva, report, presentation, teamwork, and ethics.

Holistic rubric

Gives one overall judgment. Useful for quick grading but weaker for CO/PO evidence because the basis of judgment is less traceable.

Recommended 4-level scale

| Level | General meaning |
| --- | --- |
| Excellent | Complete, accurate, independent, well justified, and professionally presented. |
| Proficient | Mostly correct with minor gaps that do not compromise the main outcome. |
| Developing | Partially correct but with important omissions, weak justification, or limited evidence. |
| Unsatisfactory | Missing, incorrect, unsafe, unsupported, or not assessable. |

Descriptors should avoid vague labels such as “good,” “average,” and “poor” unless they are explained by observable evidence. Rubric scores may be converted to marks by multiplying the level score by criterion weight. The resulting marks can support CO attainment only when the criterion is explicitly mapped to the relevant CO.
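The level-to-marks conversion can be sketched as follows, assuming the 4-level scale above is scored 4 (Excellent) down to 1 (Unsatisfactory) and criterion weights are expressed in marks. The criteria and weights in the demo are illustrative assumptions.

```javascript
// Convert analytic-rubric level scores to marks: each criterion earns
// weight × (level / 4), assuming levels are scored 1 (Unsatisfactory)
// through 4 (Excellent). Criteria and weights below are illustrative.
function rubricToMarks(scores) {
  return scores.reduce((sum, s) => sum + s.weight * (s.level / 4), 0);
}

const demo = [
  { criterion: "Technical correctness", weight: 10, level: 4 }, // 10.0
  { criterion: "Design justification", weight: 6, level: 3 }, // 4.5
  { criterion: "Communication", weight: 4, level: 2 }, // 2.0
];

// rubricToMarks(demo) → 16.5 out of 20
```

Because each criterion is scored separately, the per-criterion contributions remain traceable to the COs the criteria are mapped to.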

9. Sample assessment templates

Theory course assessment blueprint
| Assessment | Question | CO | PO | Bloom | Marks | Direct evidence? | Artifact |
| --- | --- | --- | --- | --- | --- | --- | --- |
| CT-1 | Q1 | CO1 | PO(a) | C3 | 10 | Yes | CT script |
| CT-2 | Q2 | CO1 | PO(c) | C4 | 10 | Yes | CT script |
| Final Section A | Compulsory Q1(a–c) | CO1 | PO(a)/PO(c) | C4 | 15 | Yes | Final script |
| Final Section B | Compulsory Q5(a–d) | CO2/CO3 | PO(a)/PO(c) | C4–C5 | 20 | Yes | Final script |
CO-wise mark distribution table
| CO | Class tests | Final compulsory questions | Other final questions | Total direct marks | Evidence strength |
| --- | --- | --- | --- | --- | --- |
| CO1 | 10 | 15 | 10 | 35 | Strong |
| CO2 | 10 | 15 | 10 | 35 | Strong |
| CO3 | 5 | 10 | 10 | 25 | Moderate; ensure design evidence is authentic |
EEE 304-like lab assessment plan
| Component | Sample weight | Primary evidence | Individual/group |
| --- | --- | --- | --- |
| Class participation and preparedness | 10% | Attendance and preparation record | Individual |
| Lab performance | 20% | Setup, tool use, debugging, safety | Individual/small group with individual notes |
| Lab reports | 20% | Analysis, data, interpretation, conclusion | Individual or group with individual contribution |
| Lab test and final quiz | 20% | Individual implementation and conceptual understanding | Individual |
| Final project proposal/progress/demo/viva/report | 30% | Design, validation, ethics, teamwork, communication | Group artifact plus individual viva/peer evidence |
Course-file evidence checklist template
  • Approved course outline with COs, POs, Bloom levels, and assessment plan.
  • CO-assessment mapping table and question paper blueprint.
  • Class test questions, answer schemes, selected scripts, and CO-wise marks.
  • Final examination paper with Section A/B compulsory questions tagged to COs.
  • Moderation record and post-exam analysis.
  • Rubrics for lab/project/report/presentation/viva/peer evaluation where applicable.
  • CO-wise attainment-support table and CQI action note.

10. Rubric library

The rubrics in this library are ready to adapt. Weights are suggested values and may be modified by the course teacher if the department/course committee approves the assessment plan. Keep the criterion-to-CO mapping explicit.

11. Interactive tools

All tools run fully offline in the browser. No data is sent to a server. Tables generated here are support tools; official attainment calculation should follow approved department policy.

A. Theory Course Assessment Blueprint Builder

Submit the form to generate a CO-wise assessment coverage table.

B. Compulsory Question CO Tagger

Submit the form to generate a blueprint row and warnings.

C. Rubric Builder

The builder table provides columns for Criterion, Weight, Excellent, Proficient, Developing, Unsatisfactory, and Action. Generate a rubric to preview its Markdown.

D. Lab/Project Assessment Planner

Submit the form to generate a lab/project plan.

E. CO Evidence Checker

Submit the form to check evidence sufficiency.

F. Attainment Support Calculator

Submit the form to calculate a support value.
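As an illustration only (official attainment calculation must follow the approved department policy), a support value might be computed as the percentage of students who reach a threshold fraction of a CO's direct marks:

```javascript
// Illustrative support value: percentage of students scoring at least
// thresholdPct of the CO's total direct marks. This is NOT the approved
// departmental attainment formula; it is a planning-aid sketch.
function attainmentSupport(studentMarks, totalMarks, thresholdPct = 60) {
  const meeting = studentMarks.filter(
    (m) => (m / totalMarks) * 100 >= thresholdPct
  ).length;
  return (meeting / studentMarks.length) * 100;
}

// attainmentSupport([12, 10, 15, 6], 15) → 75 (three of four reach 60%)
```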

G. Assessment Evidence Checklist


12. JavaScript data model

The editable JavaScript data are stored in the appData object near the top of the script. Faculty can update POs, Bloom levels, rubric criteria, evidence types, warning rules, sample question plans, and checklist items without changing the user interface logic.

```javascript
const appData = {
  theoryFixed: { classParticipation: 10, classTestsTotal: 20, classTests: 4, finalExam: 70 },
  bloomLevels: ["C1 Remember", "C2 Understand", "C3 Apply", "C4 Analyze", "C5 Evaluate", "C6 Create"],
  pos: ["PO(a) Engineering Knowledge", "PO(b) Problem Analysis", "PO(c) Design/Development", ...],
  rubricTemplates: { "lab performance": [["Preparedness", 10], ...] },
  evidenceTypes: ["Final compulsory question", "Class test", "Lab report", "Project demonstration", ...],
  checklist: ["Course outline", "CO-assessment mapping", ...]
};
```
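Before rendering any planning table, a tool built on this shape can sanity-check the fixed theory weights. The snippet below stubs the object so it is self-contained; the real appData lives in the page script.

```javascript
// Stub of the appData shape (self-contained for illustration only).
const appDataStub = {
  theoryFixed: { classParticipation: 10, classTestsTotal: 20, classTests: 4, finalExam: 70 },
};

// The fixed BUET EEE theory weights (10 + 20 + 70) must sum to 100.
function theoryWeightsValid(d) {
  const { classParticipation, classTestsTotal, finalExam } = d.theoryFixed;
  return classParticipation + classTestsTotal + finalExam === 100;
}

// theoryWeightsValid(appDataStub) → true
```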

13. Common mistakes and corrective actions

| Mistake | Why it is weak | Corrective action |
| --- | --- | --- |
| Assessing a CO only by attendance. | Attendance does not directly demonstrate technical learning. | Use questions, scripts, reports, practical tasks, viva, or project artifacts. |
| Mapping all questions to all COs. | Destroys traceability and inflates evidence. | Map each sub-question to the primary CO it actually assesses. |
| Using routine calculation as PO3/design evidence. | Design requires choices, constraints, synthesis, or justification. | Add design alternatives, constraints, and trade-off justification. |
| Claiming PO4/investigation without data interpretation. | Investigation requires evidence, experiment, data, uncertainty, or inference. | Assess measurement, data analysis, error/limitation discussion, and inference. |
| Claiming PO5/modern tool usage without assessed tool use. | Tool use must be performed or interpreted by students. | Preserve code, simulation output, instrument data, screenshots, or demonstration evidence. |
| Claiming PO8/ethics without explicit evidence. | Ethics cannot be inferred from normal project completion. | Assess integrity, safety, responsible practice, limitations, and ethical reflection. |
| Claiming PO9/teamwork using only group marks. | Group marks do not show individual teamwork contribution. | Use peer evaluation, individual viva, contribution logs, and instructor observation. |
| Claiming PO10/communication without communication artifact. | Communication requires assessed oral/written performance. | Use presentation, viva, technical report, user manual, or poster rubric. |
| Using vague rubric descriptors. | Raters may interpret “good” or “average” differently. | Write observable descriptors for each performance level. |
| Using only marks without preserving evidence. | Attainment becomes untraceable. | Keep question/rubric evidence, sample scripts, and CO-wise mark tables. |

14. References and source notes

The following sources should be cited or consulted when finalizing the page. Verify document versions before publication.

  1. Board of Accreditation for Engineering and Technical Education (BAETE), accreditation manuals and OBE-related documents: https://www.baetebangladesh.org/
  2. International Engineering Alliance, Graduate Attributes and Professional Competencies: https://www.ieagreements.org/
  3. ABET, Criteria for Accrediting Engineering Programs and assessment resources: https://www.abet.org/accreditation/
  4. Anderson, L. W. and Krathwohl, D. R. (eds.), A Taxonomy for Learning, Teaching, and Assessing, revised Bloom’s taxonomy.
  5. Biggs, J., constructive alignment in university teaching and assessment.
  6. Brookhart, S. M., rubric design and assessment validity.
  7. BUET undergraduate academic rules and regulations, as applicable to term final, class tests, and course assessment.
  8. Department of EEE, BUET, local course outlines for EEE 303 Digital Systems and EEE 304 Digital Systems Laboratory.
Source note: This page intentionally uses cautious wording. Where a practice is not a stated BAETE or BUET rule, it is labelled as recommended practice. The department should update links and exact policy wording when official documents are revised.

15. Integration notes for Grav

| Item | Detail |
| --- | --- |
| Suggested file path | user/pages/03.academics/01.undergraduate/obe/assessment-rubric-guide/default.md |
| Suggested page title | Assessment and Rubric Design Guideline for EEE Courses |
| Suggested menu text | Assessment & Rubric Guideline |
| Backlink | Add a link back to the OBE landing page and from the OBE landing page to this page. |
| Splitting later | Move CSS to theme/css/obe-assessment.css and JavaScript to theme/js/obe-assessment.js after testing. |
| Editable data | Edit the appData object in the script: POs, Bloom levels, rubric templates, evidence tools, and checklist items. |

Back to Top