1. Page purpose and target users
The purpose of this page is to help BUET EEE faculty design defensible assessment evidence for Course Outcomes (COs) and Program Outcomes (POs), while respecting the department’s established theory-course assessment structure. The guide is public-facing so that students and stakeholders can understand how learning is assessed, but the templates and tools are designed for faculty preparing course outlines, question papers, rubrics, course files, and accreditation evidence.
Faculty members
Prepare CO-aligned assessment plans, class tests, final-exam blueprints, rubrics, and evidence tables.
Course coordinators
Check balance across sections, COs, Bloom levels, and evidence sources before finalizing course files.
Paper setters and moderators
Use compulsory questions and marking schemes as direct CO evidence without overclaiming PO mapping.
Lab instructors
Assess practical performance, design projects, demonstrations, reports, viva, teamwork, and ethics using rubrics.
OBE committee members
Review traceability from COs to assessment artifacts and support continuous quality improvement (CQI).
Students and reviewers
Understand expectations, evidence, and transparent performance criteria in EEE courses.
2. Research summary: assessment and rubrics in OBE
Key findings from accreditation and engineering-education sources
| Finding | Local interpretation for BUET EEE | Status |
|---|---|---|
| OBE requires evidence that students achieve stated outcomes, not only that topics are taught. | Each EEE course file should preserve traceable links from COs to questions, rubrics, marks, scripts, project artifacts, and attainment analysis. | Accreditation-aligned expectation |
| Washington Accord-style graduate attributes include engineering knowledge, problem analysis, design/development, investigation, modern tool usage, ethics, teamwork, communication, project management, and lifelong learning. | EEE rubrics are needed for outcomes that cannot be validly assessed by routine numerical exam answers alone. | Accreditation-aligned expectation |
| Constructive alignment connects intended outcomes, teaching-learning activities, and assessment tasks. | CO wording, Bloom level, question type, and assessment instrument should agree; a high-level design CO should not be assessed only by substitution-type problems. | Recommended practice |
| Rubrics improve transparency and rater consistency when criteria and performance descriptors are explicit. | Lab reports, project demonstrations, viva, peer evaluation, presentation, and design reports should use analytic rubrics with preserved score sheets. | Recommended practice |
| Direct assessment uses student work; indirect assessment uses perception or supporting evidence. | Final questions, class tests, lab reports, demonstrations, viva, and projects are direct evidence. Surveys, course exit feedback, and student reflections are indirect/supporting evidence. | Recommended practice |
| Formative assessment supports improvement during learning; summative assessment supports judgment at the end. | Class tests and progress reviews may provide formative feedback and summative marks; final examination and final project demonstration are primarily summative. | Recommended practice |
| Assessment quality depends on validity, reliability, fairness, transparency, and moderation. | EEE course files should include question blueprints, marking schemes, moderated question papers, rubrics, and post-exam CO-wise analysis. | Recommended practice |
Why rubrics are needed for selected POs
Routine written problems can assess engineering knowledge and some problem analysis outcomes, but they are often insufficient for design, investigation, ethics, teamwork, communication, and contextual considerations. Rubrics allow assessors to record the quality of design choices, evidence-based interpretation, safe practice, responsible conduct, individual contribution, and communication quality. A rubric does not automatically prove PO attainment; it provides structured evidence that can feed the department-approved attainment process.
Direct vs indirect assessment
Direct assessment evaluates actual student work, such as answer scripts, lab reports, design files, hardware/software demonstrations, viva responses, project reports, and presentation performance. Indirect assessment captures perception or supporting information, such as course exit surveys, alumni feedback, employer feedback, or student reflections. Direct evidence should normally be the primary basis for CO attainment.
Formative vs summative assessment
Formative assessment supports learning during the course through feedback, such as class tests, lab observations, proposal reviews, and progress presentations. Summative assessment judges achievement after instruction, such as term final examination, final lab test, final demonstration, viva, and final report.
Validity, reliability, fairness, and constructive alignment
An assessment is more valid when it actually measures the intended CO and Bloom level. It is more reliable when different assessors would apply similar judgment. It is fairer when students know criteria and comparable students are judged by comparable evidence. It is constructively aligned when the CO, learning activity, assessment task, marking scheme, and evidence record point to the same learning target.
3. BUET EEE assessment context
EEE 303-like theory courses
For BUET EEE theory courses, the assessment structure is fixed and should not be changed by this guideline. Faculty should instead align the fixed components with COs, Bloom levels, question tags, evidence collection, and attainment-support tables.
| Component | Weight | OBE use | Evidence to preserve |
|---|---|---|---|
| Class participation | 10% | Supports engagement; should not be used as the only evidence for technical CO attainment. | Attendance/participation record and policy note. |
| Continuous assessment / class tests | 20% | Normally four class tests. Useful for formative feedback and direct CO evidence when question-wise marks are CO-tagged. | CT question papers, CO tags, sample scripts, marks table. |
| Term final examination | 70% | Three-hour comprehensive exam with Section A and Section B. Compulsory questions in both sections can provide direct CO assessment evidence. | Question paper, moderation record, compulsory question marking scheme, CO-wise mark extraction, sample scripts. |
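Because the 10% + 20% + 70% frame is fixed, any planning tool should reject a plan that drifts from it before CO tagging begins. A minimal sketch follows, assuming the theoryFixed shape from the appData object in Section 12; the `plan` argument and its field names are illustrative only.

```js
// Fixed BUET EEE theory-course frame, mirroring appData.theoryFixed (Section 12).
const theoryFixed = { classParticipation: 10, classTestsTotal: 20, classTests: 4, finalExam: 70 };

// Returns a list of problems; an empty list means the plan fits the frame.
// `plan` is a hypothetical shape such as
// { classParticipation: 10, classTests: [5, 5, 5, 5], finalExam: 70 }.
function checkTheoryPlan(plan) {
  const problems = [];
  const ctTotal = plan.classTests.reduce((sum, m) => sum + m, 0);
  if (plan.classParticipation !== theoryFixed.classParticipation)
    problems.push(`Class participation must be ${theoryFixed.classParticipation}%.`);
  if (plan.classTests.length !== theoryFixed.classTests)
    problems.push(`Expected ${theoryFixed.classTests} class tests.`);
  if (ctTotal !== theoryFixed.classTestsTotal)
    problems.push(`Class tests must total ${theoryFixed.classTestsTotal}%.`);
  if (plan.finalExam !== theoryFixed.finalExam)
    problems.push(`Final examination must carry ${theoryFixed.finalExam}%.`);
  return problems;
}
```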
EEE 304-like laboratory/sessional courses
Laboratory and sessional courses have more flexible assessment instruments. An EEE 304-like Digital Systems Laboratory can assess practical skill, Verilog/FPGA implementation, tool usage, debugging, project design, technical documentation, ethics, teamwork, communication, and contextual considerations through a combination of lab performance, reports, practical tests, quizzes, project proposals, progress reviews, demonstrations, viva, peer evaluation, and a final report/user manual.
| Instrument | Possible evidence | Suitable CO/PO evidence |
|---|---|---|
| Lab performance | Instructor observation, circuit/setup check, instrument/tool use, debugging record | Practical implementation, modern tool usage, safety, psychomotor outcomes |
| Lab report | Data, observations, analysis, limitation discussion, conclusion | Investigation, analysis, technical communication |
| Lab test / practical test | Individual implementation, troubleshooting, viva | Individual technical competence and tool use |
| Design project | Proposal, design choices, constraints, implementation, validation, demo, report | Design/development, modern tools, contextual considerations, communication |
| Peer evaluation and viva | Individual contribution, accountability, team role, technical understanding | Teamwork, ethics, individual accountability |
4. Recommended page structure
Introduction and source notes
Explain the aim, the fixed theory-course structure, and the distinction between official rules and recommended practice.
BUET EEE assessment structure
Summarize theory and lab/sessional assessment patterns using EEE 303 and EEE 304 as local reference cases.
Theory-course assessment model
Give CT blueprint, final-exam Section A/B compulsory-question model, question-to-CO mapping, and evidence table.
Laboratory/sessional model
Provide lab performance, report, lab test, project, viva, peer evaluation, presentation, and report assessment plan.
Rubric design principles
Explain analytic rubrics, criteria, levels, descriptors, weights, rater consistency, moderation, and transparency.
Interactive tools and templates
Provide blueprint builder, CO tagger, rubric builder, lab planner, evidence checker, attainment support calculator, and checklist.
5. BUET EEE theory-course assessment guideline
The fixed 10% + 20% + 70% structure should be treated as the assessment frame. OBE work is done inside this frame by making the assessment traceable: each selected question/sub-question should identify the CO, PO, Bloom level, marks, assessment type, and whether it is used as direct CO evidence.
Recommended workflow
- Write 3–5 measurable COs with Bloom levels appropriate to the course.
- Prepare a CO-wise assessment coverage plan across four class tests and final examination.
- Use class tests to sample early and mid-course COs and provide feedback.
- Design Section A and Section B compulsory questions to cover the most important direct CO evidence.
- Tag every relevant sub-question with CO, PO, Bloom level, marks, and direct-evidence status (see the data sketch after this list).
- Prepare marking schemes or solution rubrics for compulsory questions.
- Extract CO-wise marks after evaluation and preserve the evidence trail.
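A minimal data sketch for steps 5 and 7 of this workflow, assuming one primary CO per sub-question as recommended in Section 13; the field names are illustrative, not an official format.

```js
// One row per tagged sub-question; a single primary CO keeps the evidence traceable.
const blueprint = [
  { assessment: "CT-1",    question: "Q1",      co: "CO1", po: "PO(a)", bloom: "C3", marks: 10, direct: true },
  { assessment: "Final A", question: "Q1(a-c)", co: "CO1", po: "PO(a)", bloom: "C4", marks: 15, direct: true },
  { assessment: "Final A", question: "Q1(d)",   co: "CO3", po: "PO(c)", bloom: "C5", marks: 5,  direct: true },
];

// Step 7: extract CO-wise marks from the direct-evidence rows.
function coWiseMarks(rows) {
  const totals = {};
  for (const row of rows) {
    if (!row.direct) continue;                      // only direct evidence counts here
    totals[row.co] = (totals[row.co] || 0) + row.marks;
  }
  return totals;                                    // e.g. { CO1: 25, CO3: 5 }
}
```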
Example CO set for an EEE 303-like course
| CO | Sample outcome statement | Likely Bloom level | Typical evidence |
|---|---|---|---|
| CO1 | Analyze and implement combinational digital systems using Boolean algebra, logic gates, MSI components, and design constraints. | C3–C4 | Class tests, final Section A/B sub-questions |
| CO2 | Analyze and design sequential digital systems using flip-flops, counters, registers, timing concepts, and finite-state-machine models. | C4–C5 | Class tests, final compulsory questions |
| CO3 | Design digital systems using HDL/Verilog concepts and physical realization constraints, including design choices and justification. | C5–C6 | Final compulsory design question, lab/project linkage where applicable |
Sample class test blueprint
| Class test | Main topic | Primary CO | Bloom | Suggested evidence use |
|---|---|---|---|---|
| CT-1 | Boolean simplification and combinational logic | CO1 | C2–C4 | Direct evidence for CO1 |
| CT-2 | Decoder, encoder, MUX, arithmetic circuits | CO1 | C3–C4 | Direct evidence for CO1 |
| CT-3 | Flip-flops, registers, counters | CO2 | C3–C4 | Direct evidence for CO2 |
| CT-4 | FSM and HDL-oriented design interpretation | CO2/CO3 | C4–C5 | Direct evidence if marks are separated |
Sample final examination Section A/B CO assessment plan
| Section | Question | Compulsory? | Target CO | Bloom | Marks | Evidence note |
|---|---|---|---|---|---|---|
| A | Q1(a–c) | Yes | CO1 | C3–C4 | 15 | Combinational analysis/design with explicit mark split |
| A | Q1(d) | Yes | CO3 | C5 | 5 | Design choice/justification required |
| B | Q5(a–c) | Yes | CO2 | C4–C5 | 15 | Sequential/FSM analysis and synthesis |
| B | Q5(d) | Yes | CO3 | C5–C6 | 5 | HDL/realization trade-off discussion |
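The CO Evidence Checker in Section 11 can be as simple as a scan over the tagged plan for COs that lack direct evidence. A sketch under the same assumed `blueprint` shape as above:

```js
// Returns the COs from `allCos` that no direct-evidence row covers.
function uncoveredCos(allCos, rows) {
  const covered = new Set(rows.filter(r => r.direct).map(r => r.co));
  return allCos.filter(co => !covered.has(co));
}

// Example using the `blueprint` rows defined earlier: CO2 has no tagged question yet.
console.log(uncoveredCos(["CO1", "CO2", "CO3"], blueprint)); // ["CO2"]
```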
6. Compulsory-question guideline for term final examination
Compulsory questions in Section A and Section B are useful for OBE because all students answer them; therefore, they provide a common direct evidence base for CO assessment. Their design must be intentional and documented.
Recommended structure
Use a clear stem, separated sub-parts, explicit marks, one primary CO per sub-part, Bloom tags, and a marking scheme that states what earns credit.
Multi-CO questions
A compulsory question may assess multiple COs only if the parts are separable, the mark distribution is explicit, and the evidence extraction table can allocate marks to the relevant COs.
Higher Bloom levels
To justify C5/C6 or design mapping, the question should require comparison, synthesis, constraints, trade-off analysis, decision-making, or justification.
Evidence collection
Preserve the moderated question paper, marking scheme, evaluated scripts or samples, CO-wise mark table, and short post-exam analysis.
Warnings for final-exam mapping
- Do not claim CO3/design attainment from a purely numerical substitution problem.
- Do not map PO3/design unless the question requires design choices, constraints, synthesis, or justification.
- Do not map PO4/investigation unless data analysis, experiment interpretation, uncertainty, or evidence-based inference is assessed.
- Do not map PO5/modern tool usage unless actual tool use or tool-based interpretation is assessed.
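These warnings can be encoded as data so the CO Tagger flags weak mappings automatically. Section 12 lists "warning rules" in appData without fixing their shape, so the structure below is an assumption; `q.features` is a hypothetical checklist filled in by the paper setter.

```js
// Each rule blocks a PO mapping unless the question shows at least one required feature.
const warningRules = [
  { po: "PO3", needsAny: ["design choices", "constraints", "synthesis", "justification"],
    message: "PO3/design needs design choices, constraints, synthesis, or justification." },
  { po: "PO4", needsAny: ["data analysis", "experiment interpretation", "uncertainty", "inference"],
    message: "PO4/investigation needs data analysis or evidence-based inference." },
  { po: "PO5", needsAny: ["tool use", "tool-based interpretation"],
    message: "PO5/modern tool usage needs assessed tool use." },
];

// Returns the warnings triggered by one tagged question, e.g.
// checkMapping({ po: "PO3", features: ["numerical substitution"] }).
function checkMapping(q) {
  return warningRules
    .filter(rule => q.po.startsWith(rule.po))
    .filter(rule => !rule.needsAny.some(f => q.features.includes(f)))
    .map(rule => rule.message);
}
```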
7. Laboratory/sessional-course assessment guideline
An EEE 304-like course can use experiments in the first part and a design project in the second part. The assessment plan should combine group performance with individual accountability. Group artifacts may demonstrate system-level achievement, but individual viva, peer evaluation, logbooks, and contribution records are needed to support individual CO attainment.
| Assessment area | How to assess | Evidence |
|---|---|---|
| Psychomotor/practical skill | Observe setup construction, wiring discipline, measurement, debugging, and equipment handling. | Instructor rubric, lab notebook, photos where allowed, practical test record. |
| Modern tool usage | Assess use of EDA tools, FPGA boards, simulation, measurement instruments, or embedded development tools. | Tool output, code/design file, screenshots, viva, demonstration. |
| Design/development | Require problem definition, requirements, alternatives, constraints, implementation, validation, and trade-off justification. | Proposal, design review, demo, report, viva. |
| Ethics and responsible practice | Assess integrity declaration, safety awareness, honest limitation reporting, responsible component/data/tool use. | Ethics statement, viva record, report section, instructor observation. |
| Teamwork | Combine peer evaluation, individual viva, contribution log, and instructor observation. | Peer forms, individual marks, role documentation. |
| Communication | Assess oral presentation, demo explanation, report, user manual, and response to questions. | Presentation rubric, report rubric, user manual, viva notes. |
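One widely used way to derive individual marks from a group artifact is a capped peer-assessment factor. This is a generic technique, not a prescribed department formula; the cap value and rating scale below are assumptions.

```js
// Moderates a group mark by how a student's peer ratings compare with the group mean.
// ratingsReceived: peer scores given to this student; cap limits inflation from generous raters.
function individualMark(groupMark, ratingsReceived, groupMeanRating, cap = 1.1) {
  const mean = ratingsReceived.reduce((s, r) => s + r, 0) / ratingsReceived.length;
  const factor = Math.min(cap, mean / groupMeanRating); // factor 1.0 = average contributor
  return Math.round(groupMark * factor * 10) / 10;
}

// Example: group mark 80; this student's ratings sit slightly above the group mean of 4.0.
console.log(individualMark(80, [4, 5, 4], 4.0)); // 86.7
```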
8. Rubric design principles
A rubric is a scoring guide that defines assessment criteria, performance levels, descriptors, and weights. For complex outcomes, an analytic rubric is usually preferable because it separates dimensions such as technical correctness, design justification, safety, teamwork, and communication.
Analytic rubric
Scores multiple criteria separately. Recommended for lab performance, project demonstration, viva, report, presentation, teamwork, and ethics.
Holistic rubric
Gives one overall judgment. Useful for quick grading but weaker for CO/PO evidence because the basis of judgment is less traceable.
Recommended 4-level scale
| Level | General meaning |
|---|---|
| Excellent | Complete, accurate, independent, well justified, and professionally presented. |
| Proficient | Mostly correct with minor gaps that do not compromise the main outcome. |
| Developing | Partially correct but with important omissions, weak justification, or limited evidence. |
| Unsatisfactory | Missing, incorrect, unsafe, unsupported, or not assessable. |
Descriptors should avoid vague labels such as “good,” “average,” and “poor” unless they are explained by observable evidence. Rubric scores may be converted to marks by multiplying the level score by criterion weight. The resulting marks can support CO attainment only when the criterion is explicitly mapped to the relevant CO.
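A minimal conversion sketch, assuming the four-level scale is scored 4 (Excellent) down to 1 (Unsatisfactory) and criterion weights are expressed directly in marks, matching the [criterion, weight] pairs in Section 12:

```js
// Converts analytic-rubric level scores to marks: full weight only at the top level.
// criteria: [name, weightInMarks] pairs; scores: { criterionName: level } with levels 1..4.
function rubricToMarks(criteria, scores, topLevel = 4) {
  let total = 0;
  for (const [name, weight] of criteria) {
    total += weight * (scores[name] / topLevel);
  }
  return total;
}

// Example with an assumed two-criterion rubric worth 20 marks.
const demoCriteria = [["Technical correctness", 12], ["Communication", 8]];
console.log(rubricToMarks(demoCriteria, { "Technical correctness": 4, "Communication": 3 })); // 12 + 6 = 18
```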
9. Sample assessment templates
Theory course assessment blueprint
| Assessment | Question | CO | PO | Bloom | Marks | Direct evidence? | Artifact |
|---|---|---|---|---|---|---|---|
| CT-1 | Q1 | CO1 | PO(a) | C3 | 10 | Yes | CT script |
| CT-2 | Q2 | CO1 | PO(c) | C4 | 10 | Yes | CT script |
| Final Section A | Compulsory Q1(a–c) | CO1 | PO(a)/PO(c) | C4 | 15 | Yes | Final script |
| Final Section B | Compulsory Q5(a–d) | CO2/CO3 | PO(a)/PO(c) | C4–C5 | 20 | Yes | Final script |
CO-wise mark distribution table
| CO | Class tests | Final compulsory questions | Other final questions | Total direct marks | Evidence strength |
|---|---|---|---|---|---|
| CO1 | 10 | 15 | 10 | 35 | Strong |
| CO2 | 10 | 15 | 10 | 35 | Strong |
| CO3 | 5 | 10 | 10 | 25 | Moderate; ensure design evidence is authentic |
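Once CO-wise marks exist per student, the Attainment Support Calculator (Section 11) can report what fraction of the class crossed a threshold. The 60% threshold below is purely illustrative; official thresholds follow approved department policy.

```js
// studentMarks: one object per student with CO-wise marks; maxMarks: available direct marks per CO.
// Returns the fraction of students scoring at least `threshold` of the available marks.
function attainmentRate(studentMarks, maxMarks, co, threshold = 0.6) {
  const achieved = studentMarks.filter(s => s[co] / maxMarks[co] >= threshold).length;
  return achieved / studentMarks.length;
}

// Example with three hypothetical students and the CO1 total from the table above.
const maxMarks = { CO1: 35 };
const students = [{ CO1: 30 }, { CO1: 20 }, { CO1: 25 }];
console.log(attainmentRate(students, maxMarks, "CO1")); // 2 of 3 students at or above 21 marks ≈ 0.667
```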
EEE 304-like lab assessment plan
| Component | Sample weight | Primary evidence | Individual/group |
|---|---|---|---|
| Class participation and preparedness | 10% | Attendance and preparation record | Individual |
| Lab performance | 20% | Setup, tool use, debugging, safety | Individual/small group with individual notes |
| Lab reports | 20% | Analysis, data, interpretation, conclusion | Individual or group with individual contribution |
| Lab test and final quiz | 20% | Individual implementation and conceptual understanding | Individual |
| Final project proposal/progress/demo/viva/report | 30% | Design, validation, ethics, teamwork, communication | Group artifact plus individual viva/peer evidence |
Course-file evidence checklist template
- Approved course outline with COs, POs, Bloom levels, and assessment plan.
- CO-assessment mapping table and question paper blueprint.
- Class test questions, answer schemes, selected scripts, and CO-wise marks.
- Final examination paper with Section A/B compulsory questions tagged to COs.
- Moderation record and post-exam analysis.
- Rubrics for lab/project/report/presentation/viva/peer evaluation where applicable.
- CO-wise attainment-support table and CQI action note.
10. Rubric library
The rubrics in this library are ready to adapt. Weights are suggested values and may be modified by the course teacher if the department/course committee approves the assessment plan. Keep the criterion-to-CO mapping explicit.
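Library entries can extend the [criterion, weight] shape sketched in Section 12; the `co` and `descriptors` fields below are assumed extensions that keep the criterion-to-CO mapping and the four-level descriptors explicit.

```js
// One possible library entry; names, weights, and descriptors are all editable.
const projectDemoRubric = {
  name: "Project demonstration",
  levels: ["Excellent", "Proficient", "Developing", "Unsatisfactory"],
  criteria: [
    { name: "Functional correctness", co: "CO3", weight: 10,
      descriptors: ["All requirements demonstrated", "Minor gaps", "Major gaps", "Not functional"] },
    { name: "Design justification", co: "CO3", weight: 5,
      descriptors: ["Trade-offs argued with evidence", "Some justification", "Weak justification", "None offered"] },
  ],
};
```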
11. Interactive tools
All tools run fully offline in the browser. No data is sent to a server. Tables generated here are support tools; official attainment calculation should follow approved department policy.
A. Theory Course Assessment Blueprint Builder
B. Compulsory Question CO Tagger
C. Rubric Builder
D. Lab/Project Assessment Planner
E. CO Evidence Checker
F. Attainment Support Calculator
G. Assessment Evidence Checklist
12. JavaScript data model
All editable data are stored in the appData object near the top of the script. Faculty can update POs, Bloom levels, rubric criteria, evidence types, warning rules, sample question plans, and checklist items without changing the user-interface logic.
```js
const appData = {
  // Fixed theory-course structure in percent (see Section 3); do not edit.
  theoryFixed: { classParticipation: 10, classTestsTotal: 20, classTests: 4, finalExam: 70 },
  bloomLevels: ["C1 Remember", "C2 Understand", "C3 Apply", "C4 Analyze", "C5 Evaluate", "C6 Create"],
  pos: ["PO(a) Engineering Knowledge", "PO(b) Problem Analysis", "PO(c) Design/Development", ...],
  // Rubric criteria as [criterion, weight] pairs, keyed by rubric name.
  rubricTemplates: { "lab performance": [["Preparedness", 10], ...] },
  evidenceTypes: ["Final compulsory question", "Class test", "Lab report", "Project demonstration", ...],
  checklist: ["Course outline", "CO-assessment mapping", ...]
};
```
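Because the tools only read appData, extending a list is enough to change the UI. A hedged sketch of how one tool might populate a dropdown at page load; the element ID is an assumption for illustration:

```js
// Fills a <select> element from a list in appData; called once per tool at page load.
function fillSelect(id, options) {
  const select = document.getElementById(id);
  for (const text of options) {
    const option = document.createElement("option");
    option.textContent = text;
    select.appendChild(option);
  }
}

// "bloom-select" is a hypothetical element ID, not part of the shipped markup.
fillSelect("bloom-select", appData.bloomLevels);
```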
13. Common mistakes and corrective actions
| Mistake | Why it is weak | Corrective action |
|---|---|---|
| Assessing a CO only by attendance. | Attendance does not directly demonstrate technical learning. | Use questions, scripts, reports, practical tasks, viva, or project artifacts. |
| Mapping all questions to all COs. | Destroys traceability and inflates evidence. | Map each sub-question to the primary CO it actually assesses. |
| Using routine calculation as PO3/design evidence. | Design requires choices, constraints, synthesis, or justification. | Add design alternatives, constraints, and trade-off justification. |
| Claiming PO4/investigation without data interpretation. | Investigation requires evidence, experiment, data, uncertainty, or inference. | Assess measurement, data analysis, error/limitation discussion, and inference. |
| Claiming PO5/modern tool usage without assessed tool use. | Tool use must be performed or interpreted by students. | Preserve code, simulation output, instrument data, screenshots, or demonstration evidence. |
| Claiming PO8/ethics without explicit evidence. | Ethics cannot be inferred from normal project completion. | Assess integrity, safety, responsible practice, limitations, and ethical reflection. |
| Claiming PO9/teamwork using only group marks. | Group marks do not show individual teamwork contribution. | Use peer evaluation, individual viva, contribution logs, and instructor observation. |
| Claiming PO10/communication without communication artifact. | Communication requires assessed oral/written performance. | Use presentation, viva, technical report, user manual, or poster rubric. |
| Using vague rubric descriptors. | Raters may interpret “good” or “average” differently. | Write observable descriptors for each performance level. |
| Using only marks without preserving evidence. | Attainment becomes untraceable. | Keep question/rubric evidence, sample scripts, and CO-wise mark tables. |
14. References and source notes
The following sources should be cited or consulted when finalizing the page. Verify document versions before publication.
- Board of Accreditation for Engineering and Technical Education (BAETE), accreditation manuals and OBE-related documents: https://www.baetebangladesh.org/
- International Engineering Alliance, Graduate Attributes and Professional Competencies: https://www.ieagreements.org/
- ABET, Criteria for Accrediting Engineering Programs and assessment resources: https://www.abet.org/accreditation/
- Anderson, L. W. and Krathwohl, D. R. (eds.), A Taxonomy for Learning, Teaching, and Assessing, revised Bloom’s taxonomy.
- Biggs, J., constructive alignment in university teaching and assessment.
- Brookhart, S. M., rubric design and assessment validity.
- BUET undergraduate academic rules and regulations, as applicable to term final, class tests, and course assessment.
- Department of EEE, BUET, local course outlines for EEE 303 Digital Systems and EEE 304 Digital Systems Laboratory.
15. Integration notes for Grav
| Item | Recommendation |
|---|---|
| Suggested file path | user/pages/03.academics/01.undergraduate/obe/assessment-rubric-guide/default.md |
| Suggested page title | Assessment and Rubric Design Guideline for EEE Courses |
| Suggested menu text | Assessment & Rubric Guideline |
| Backlink | Add a link back to the OBE landing page and from the OBE landing page to this page. |
| Splitting later | Move CSS to theme/css/obe-assessment.css and JavaScript to theme/js/obe-assessment.js after testing. |
| Editable data | Edit the appData object in the script: POs, Bloom levels, rubric templates, evidence tools, and checklist items. |