Bloom’s Taxonomy Guideline for Electrical and Electronic Engineering


Department of EEE • OBE Academic Resource


A practical guide for EEE faculty members to write measurable Course Outcomes, select discipline-appropriate action verbs, and align teaching, assessment, CO–PO mapping, and CQI evidence across theory, laboratory, simulation, design, project, and thesis courses.

Measurable Course Outcomes: Write observable, assessable CO statements that support attainment analysis.
EEE-specific Action Verbs: Choose verbs that fit circuits, electronics, labs, embedded systems, power, communication, and design work.
Assessment Alignment: Match the intended cognitive level with appropriate evidence, rubrics, and CQI use.
Scope

This page synthesizes the revised Bloom’s Taxonomy, OBE design guidance, current BAETE accreditation criteria and definitions, Washington Accord style graduate-attribute guidance, and widely used engineering assessment practices. Where a point below is a planning recommendation rather than a published institutional rule, it is explicitly framed as a recommendation. Because current BAETE EEE program-specific criteria expect breadth and depth across analysis and design of complex electrical and electronic devices, software, and systems, EEE faculty generally need more discipline-specific CO wording than generic Bloom verb lists alone can provide.

In this page, CO refers to a course-level outcome, while PO refers to a graduation-level program outcome or graduate attribute. BAETE defines POs as what students are expected to know and be able to do by graduation, and defines direct assessment as direct observation of student knowledge or attitude against measurable learning outcomes.

Foundations for OBE and Accreditation

Why Bloom’s Taxonomy Matters in OBE

The revised Bloom framework gives faculty a common language for stating what students should be able to do and for matching those expectations with suitable assessment methods. In engineering OBE, this is not a cosmetic exercise: BAETE requires each course to facilitate the achievement of course outcomes through teaching-learning and assessment methods, to document learning plans and CO attainment, to show appropriate CO–PO correlation, and to demonstrate PO attainment through direct methods by graduation. Washington Accord style guidance likewise treats graduate attributes as individually assessable outcomes, while ABET defines assessment and evaluation as documented processes whose results are used for continuous improvement.

Writing measurable COs

Bloom helps move statements from topic coverage to observable performance. In EEE, “Understand op-amps” is too vague for attainment analysis, but “analyze and design op-amp circuits under stated gain and bandwidth constraints” gives clearer evidence targets.

Aligning teaching-learning activities

If a CO expects analysis of a protection scheme, instability mechanism, or sampled-data system, classroom and tutorial activities should include comparison, diagnosis, modelling, interpretation, and justification rather than only note-taking or routine substitution.

Designing assessments with valid evidence

Lower-level verbs typically need recall or explanation evidence, while higher-level verbs need design decisions, model interpretation, validation, critique, or prototype evidence. The assessment artifact should therefore fit the intended verb.

Supporting CO–PO mapping

Explicit verbs make it easier to map a CO to the right PO. For example, “conduct and interpret experiments” signals investigation-oriented attributes, while “design and justify a converter” aligns more naturally with design, modern tools, and communication evidence.

Distinguishing foundational and advanced learning

Basic circuit laws, device terminology, and standard procedures may sit at Remember–Understand–Apply, whereas controller tuning trade-offs, VLSI verification, renewable system optimization, and thesis methodology usually require Analyze–Evaluate–Create.

Feeding CQI with meaningful data

CQI becomes more useful when each CO states a measurable performance that can be sampled, rubric-scored, discussed in course review, and improved in the next offering. Vague COs create vague attainment data; clear COs create actionable CQI evidence.

Accreditation note

BAETE’s current criteria also place explicit responsibility on faculty for designing and updating curriculum, establishing outcomes, selecting appropriate assessment tools, and being adequately trained to set course outcomes and assess outcome achievement. This is one reason a Bloom-guided CO writing resource is useful at department level.


The Six Bloom Levels in EEE

The six levels below follow the revised cognitive sequence Remember, Understand, Apply, Analyze, Evaluate, and Create. The definitions come from the revised taxonomy; the EEE contexts, verbs, and CO statements are discipline-specific recommendations synthesized for faculty use in Electrical and Electronic Engineering.

Remember

Foundational recall

Short definition. Retrieve relevant knowledge from memory.

Expected student capability. Recall laws, definitions, symbols, device regions, standard terminology, safety steps, and canonical block names used in EEE.

Suitable EEE contexts. KCL/KVL and passive sign convention in circuits; logic gate symbols and truth-table terminology in digital logic; fault type names in power systems; standard instrument ranges and terminal names in laboratory work; pn junction or photodiode terminology in semiconductor and photonics courses.

Typical assessment formats. MCQ, labelled diagram, oral spot question, short question, matching item, or structured identification task.

Sample action verbs. define, identify, state, list, label, name, recall, recognize.

Illustrative COs. Identify the laws, symbols, and sign conventions used in basic circuit analysis. State the operating regions and main terminal characteristics of diode, BJT, and MOSFET devices.

Understand

Meaning and interpretation

Short definition. Determine the meaning of instructional messages.

Expected student capability. Explain principles, compare alternatives, interpret plots and waveforms, classify behaviors, and summarize engineering relationships in words and diagrams.

Suitable EEE contexts. Explain negative feedback in op-amp circuits; interpret Bode plots in control systems; compare analog and digital modulation; explain sampling and quantization in signal processing; interpret calibration curves in instrumentation.

Typical assessment formats. Short explanation, annotated diagram, compare–contrast question, graph interpretation, concept mapping, oral explanation.

Sample action verbs. explain, compare, interpret, classify, summarize, illustrate, discuss, distinguish.

Illustrative COs. Explain how feedback affects gain, bandwidth, and stability in operational amplifier circuits. Interpret the time-domain and frequency-domain behavior of first- and second-order control systems.

Apply

Procedure in use

Short definition. Use a procedure in a given situation.

Expected student capability. Carry out calculations, structured analysis procedures, programming tasks, laboratory procedures, and guided simulations for known or moderately unfamiliar engineering problems.

Suitable EEE contexts. Solve nodal or mesh equations in circuits; calculate balanced three-phase power; implement a timer-interrupt routine in a microcontroller; simulate a communication link or digital filter; calibrate a sensor interface in the laboratory.

Typical assessment formats. Numerical problem, structured programming task, guided simulation, practical lab exercise, worksheet-based experiment.

Sample action verbs. calculate, solve, implement, simulate, use, configure, program, measure, calibrate, bias, sample, quantize.

Illustrative COs. Calculate currents, voltages, power, and transient response for specified linear circuit conditions. Implement and test peripheral interfacing and interrupt handling for a prescribed microprocessor or microcontroller application.
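As one concrete instance of the nodal-analysis context above, a two-node resistive network reduces to a 2×2 linear system in the node voltages. The sketch below, with invented component values, solves that system by Cramer's rule; it is an illustration of an Apply-level task, not a prescribed exercise.

```python
# Apply-level sketch (illustrative values): nodal analysis of a two-node
# resistive network. A 1 A source feeds node 1; R1 = 2 ohm from node 1 to
# ground, R2 = 4 ohm between the nodes, R3 = 4 ohm from node 2 to ground.
# KCL at each node gives the conductance system G v = i.

def solve_2x2(a11, a12, a21, a22, b1, b2):
    """Solve [[a11, a12], [a21, a22]] [x1, x2]^T = [b1, b2]^T by Cramer's rule."""
    det = a11 * a22 - a12 * a21
    return (b1 * a22 - a12 * b2) / det, (a11 * b2 - b1 * a21) / det

G11 = 1 / 2 + 1 / 4        # self-conductance at node 1 (through R1 and R2)
G22 = 1 / 4 + 1 / 4        # self-conductance at node 2 (through R2 and R3)
G12 = G21 = -1 / 4         # mutual conductance through R2
v1, v2 = solve_2x2(G11, G12, G21, G22, 1.0, 0.0)
print(f"v1 = {v1:.2f} V, v2 = {v2:.2f} V")   # v1 = 1.60 V, v2 = 0.80 V
```

An exam or lab item at this level becomes more demanding when it also asks students to verify the result, for example by checking KCL at each node with the computed voltages.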

Analyze

Structure and diagnosis

Short definition. Break material into parts and determine how the parts relate to one another or to an overall structure or purpose.

Expected student capability. Decompose systems, derive relationships, locate causes of performance behavior, interpret interactions, and diagnose faults or limitations.

Suitable EEE contexts. Derive small-signal models in electronics; analyze transient classes in RLC circuits; differentiate modulation schemes under channel constraints; troubleshoot analog, digital, or embedded subsystems; analyze timing paths in VLSI; model semiconductor or optoelectronic device behavior from measured data.

Typical assessment formats. Derivation with interpretation, waveform/data analysis, fault diagnosis, structured troubleshooting viva, comparative technical memo.

Sample action verbs. analyze, derive, differentiate, model, troubleshoot, distinguish, estimate, examine, debug, characterize.

Illustrative COs. Analyze the dynamic response of RLC and feedback systems and interpret the effect of parameter changes on stability and damping. Troubleshoot mixed analog-digital circuits by interpreting measured waveforms and isolating probable fault locations.
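The first CO above has a compact analytic core: for a standard underdamped second-order system, the damping ratio alone fixes the fractional peak overshoot of the step response. A minimal sketch, with illustrative damping values:

```python
# Analyze-level sketch: peak overshoot of the unit-step response of a standard
# second-order system (a series RLC circuit, or a second-order feedback loop).
# For an underdamped system with damping ratio 0 < zeta < 1,
#     Mp = exp(-zeta * pi / sqrt(1 - zeta**2)).
import math

def peak_overshoot(zeta):
    """Fractional peak overshoot for an underdamped second-order step response."""
    if not 0 < zeta < 1:
        raise ValueError("closed-form overshoot applies to underdamped systems only")
    return math.exp(-zeta * math.pi / math.sqrt(1 - zeta ** 2))

for zeta in (0.2, 0.5, 0.7):
    print(f"zeta = {zeta}: overshoot = {100 * peak_overshoot(zeta):5.1f} %")
```

An Analyze-level prompt built on this would ask students to interpret the trend, for example why increasing damping trades overshoot against response speed, rather than merely evaluate the formula.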

Evaluate

Judgment with criteria

Short definition. Make judgments based on criteria and standards.

Expected student capability. Justify choices, validate results, critique alternatives, rank options, optimize parameters, and defend a technical decision using evidence.

Suitable EEE contexts. Justify relay or protective device settings in power systems; validate simulation against experiment in electronics or instrumentation; compare modulation schemes using BER, bandwidth, and power criteria; evaluate MPPT or storage choices in renewable energy; assess photonic or semiconductor device suitability for an application.

Typical assessment formats. Design review, comparative report, defended numerical/design problem, lab validation report, project presentation with rubric, thesis questioning.

Sample action verbs. evaluate, justify, validate, verify, critique, optimize, assess, rank, defend, select.

Illustrative COs. Evaluate alternative digital filter or controller designs against stated technical criteria and justify the selected solution. Validate measured laboratory results against theoretical or simulated predictions and explain discrepancies using uncertainty and non-ideal effects.
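The second CO above implies a small quantitative routine: compare the mean of repeated readings with the prediction using the standard uncertainty of the mean. The sketch below uses invented data and a conventional coverage factor purely for illustration.

```python
# Evaluate-level sketch: compare repeated measurements against a theoretical
# prediction using the standard uncertainty of the mean (Type A evaluation).
# The readings and the predicted value are invented for illustration.
import math
import statistics

readings = [4.97, 5.03, 5.01, 4.99, 5.02, 4.98]   # e.g. measured amplifier gain
predicted = 5.00                                   # theoretical or simulated value

mean = statistics.mean(readings)
u_mean = statistics.stdev(readings) / math.sqrt(len(readings))  # uncertainty of the mean
k = 2                                              # coverage factor, roughly 95 %
consistent = abs(mean - predicted) <= k * u_mean
print(f"mean = {mean:.3f}, u = {u_mean:.4f}, consistent with prediction: {consistent}")
```

The Evaluate-level judgment lives in the discussion that follows such a check: when the interval test fails, students should attribute the discrepancy to identified error sources or non-ideal effects rather than discard the data.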

Create

New solution or method

Short definition. Put elements together to form a coherent or functional whole, or reorganize elements into a new pattern or structure.

Expected student capability. Design, develop, formulate, prototype, integrate, and communicate original or substantially open-ended engineering solutions or investigations.

Suitable EEE contexts. Design a regulated power supply or converter; develop an FPGA/Verilog system; formulate a communication receiver chain; create an embedded monitoring node; develop a VLSI testbench and verification flow; formulate and execute a thesis methodology.

Typical assessment formats. Open-ended design task, major project, prototype demonstration, thesis report, dissertation chapter, capstone review.

Sample action verbs. design, develop, formulate, synthesize, prototype, integrate, construct, fabricate, create.

Illustrative COs. Design and verify an electrical or electronic system that satisfies stated technical constraints, safety considerations, and performance targets. Develop and defend a thesis methodology for modelling, implementing, and validating an EEE problem solution using literature, simulation, experiment, or prototype evidence.


Course and Assessment Alignment

Bloom’s Taxonomy in Electrical Engineering Courses

Recommendation

The table below is a planning aid rather than a fixed accreditation rule. BAETE and Washington Accord style frameworks require appropriate, assessable, and documented outcomes, but they do not prescribe one identical Bloom distribution for every EEE course. What matters is whether the selected level is appropriate for the course purpose, year level, content depth, and assessment evidence.

Recommended Bloom level emphasis by course type in Electrical and Electronic Engineering
  • Introductory theory courses (Remember, Understand, Apply): Use for foundational knowledge, basic procedures, and early structured problem solving before extended open-ended judgment is expected. Examples: Basic Electrical Circuits, Electronic Devices, Digital Logic, Introductory Measurement.
  • Core analytical courses (Apply, Analyze, with some Evaluate): Move beyond formula substitution toward modelling, interpretation, and diagnosis of engineering behavior. Examples: Signals and Systems, Control Systems, Power Systems, Semiconductor Devices, Communication Systems.
  • Mathematical or modelling courses (Understand, Apply, Analyze, Evaluate): Students should not only reproduce transforms or state-space forms but also interpret assumptions, compare models, and judge suitability. Examples: Probability and Statistics, Numerical Methods, DSP Modelling, Engineering Mathematics.
  • Laboratory courses (Apply, Analyze, Evaluate): Strong labs require procedure execution, data interpretation, uncertainty consideration, instrument use, and validation against theory or simulation. Examples: Circuits Lab, Electronics Lab, Instrumentation Lab, Power Lab, Embedded Lab.
  • Simulation or software courses (Apply, Analyze, Evaluate, Create): The higher value comes from model set-up, interpretation, debugging, comparison, and design decisions rather than only obtaining a screenshot or waveform. Examples: MATLAB or Simulink courses, SPICE-based Electronics, PSCAD studies, HDL simulation, DSP coding.
  • Design-oriented courses (Analyze, Evaluate, Create): Constraints, trade-offs, standards, safety, cost, and performance justification should be explicit. Examples: Power Electronics Design, Communication Link Design, Analog IC Design, Renewable Energy System Design.
  • Thesis or project courses (Analyze, Evaluate, Create): Students should frame a problem, review literature, select methodology, validate evidence, and defend conclusions. Examples: Final Year Project, Thesis, Independent Research, Capstone Design, VLSI or Embedded Design Project.

A common design pattern is to treat Remember and Understand as foundational, Analyze–Evaluate–Create as advanced, and Apply as a bridging level whose difficulty depends on novelty, complexity, and transfer. In EEE, a routine resistor network calculation and an open-ended MATLAB implementation are both “apply” in grammar, but not in cognitive demand. Context matters.

Recommended Bloom Levels by Assessment Type

The mapping below is a recommended EEE interpretation drawn from revised Bloom guidance, assessment-task examples associated with cognitive levels, and engineering alignment literature. It should be read as “best fit in typical use,” not as an absolute rule.

Recommended Bloom level fit by assessment type in Electrical and Electronic Engineering
  • MCQ (Remember, Understand; sometimes Apply or Analyze): Useful for broad syllabus sampling. Can test reasoning if the stem requires waveform interpretation, fault identification, or method selection, but it is usually weak evidence for design or original creation when used alone.
  • Short question (Remember, Understand, limited Analyze): Good for definitions, explanations, comparisons, assumptions, and interpretation of small schematic or block-diagram cases.
  • Numerical problem (Apply, Analyze): Strong when the problem demands method choice, interpretation, or multi-step reasoning. Weak when it only rewards memorized substitution.
  • Derivation (Understand, Apply, Analyze): Best used when students must show assumptions, the sequence of reasoning, and physical meaning, not merely reproduce memorized algebra.
  • Design problem (Analyze, Evaluate, Create): Should include explicit constraints such as efficiency, stability, safety, bandwidth, cost, power, device limits, or manufacturability.
  • Simulation assignment (Apply, Analyze, Evaluate, Create): High value when students justify model assumptions, compare with theory or experiment, and explain discrepancies. Raw screenshots alone are weak evidence.
  • Laboratory experiment (Apply, Analyze, Evaluate): Natural fit for measurement, calibration, characterization, data analysis, uncertainty discussion, troubleshooting, and validation.
  • Lab viva (Understand, Apply, Analyze, Evaluate): Useful for probing procedural reasoning, device behavior, assumptions, error sources, and interpretation of observed results.
  • Project report (Analyze, Evaluate, Create): Good for integrating problem framing, design logic, implementation evidence, verification, limitations, and future work.
  • Thesis defense (Analyze, Evaluate, Create): Strong evidence for independent inquiry, methodological justification, literature use, validation, and technical communication.
  • Presentation (Understand up to Evaluate or Create): The level depends on the task. A summary talk may only show understanding; a defended design proposal or thesis presentation may show evaluation and creation.

Action Verbs and the Faculty Selector

EEE-Specific Action Verb Bank

Bloom verb lists are most useful when they are adapted to disciplinary tasks. The bank below combines widely used measurable verbs with EEE-specific tasks in theory, laboratory, simulation, and design environments. It is intended as a drafting aid for COs and assessment prompts.

EEE-specific action verb bank organized by Bloom level and engineering task type
Remember
  General measurable verbs: define, identify, state, list, label, name, recall.
  Theory and problem solving: state assumptions, identify laws, list logic families, label blocks.
  Lab and instrumentation: identify instruments, label terminals, state safety steps.
  Simulation and software: identify parameters, list software blocks, recognize outputs.
  Design, project, and thesis: identify requirements, standards, constraints, objectives.

Understand
  General measurable verbs: explain, compare, interpret, summarize, classify, illustrate, discuss.
  Theory and problem solving: explain feedback, compare devices, interpret plots, classify faults.
  Lab and instrumentation: interpret calibration curves, explain procedures, compare readings.
  Simulation and software: interpret waveforms, explain model assumptions, compare cases.
  Design, project, and thesis: explain design rationale, compare alternatives, summarize literature.

Apply
  General measurable verbs: apply, calculate, solve, implement, use, operate.
  Theory and problem solving: calculate, solve, bias, rectify, regulate, amplify, modulate.
  Lab and instrumentation: measure, calibrate, test, sample, quantify, configure.
  Simulation and software: simulate, program, interface, implement, configure, demodulate.
  Design, project, and thesis: implement, assemble, integrate, prototype, fabricate.

Analyze
  General measurable verbs: analyze, model, derive, differentiate, distinguish, examine.
  Theory and problem solving: derive, analyze, estimate, differentiate, decompose, model.
  Lab and instrumentation: characterize, troubleshoot, debug, isolate, examine error sources.
  Simulation and software: debug, profile, model, compare responses, trace timing, filter.
  Design, project, and thesis: analyze requirements, examine trade-offs, diagnose failure modes.

Evaluate
  General measurable verbs: evaluate, justify, validate, verify, critique, assess, optimize.
  Theory and problem solving: justify assumptions, assess method choice, optimize parameters.
  Lab and instrumentation: validate data, verify measurement chain, critique procedure.
  Simulation and software: validate results, verify functionality, benchmark alternatives.
  Design, project, and thesis: justify design choice, assess feasibility, optimize performance, defend conclusions.

Create
  General measurable verbs: create, design, develop, formulate, synthesize, construct.
  Theory and problem solving: formulate model, synthesize controller or filter, create architecture.
  Lab and instrumentation: develop test setup, create experiment, design workflow.
  Simulation and software: develop code, synthesize HDL, build simulation framework.
  Design, project, and thesis: design, develop, prototype, integrate, fabricate, compensate, protect.
Important note

Understand is a valid Bloom category label, but many outcome-writing guides advise against using “understand” as the final verb in a published CO because it is difficult to observe directly in assessment. For measurable CO wording, replace it with a verb that reveals the evidence you actually expect, such as explain, compare, interpret, classify, analyze, justify, or design. The same caution often applies to “know,” “learn,” “appreciate,” and “be familiar with.”

Also note that some EEE verbs are context-sensitive. For example, measure, model, test, verify, validate, synthesize, and optimize can sit at different Bloom levels depending on whether the task is routine, interpretive, judgment-based, or genuinely open-ended.

Interactive Action Verb Selector

Use the selector below to generate a practical starting point for CO drafting. The suggestions are intentionally recommendation-oriented: they should be adapted to course scope, credit hours, student level, teaching-learning activities, actual assessment tasks, and the department’s CO–PO mapping logic. In OBE, the verb, content, learning activity, and evidence should reinforce one another.

EEE Action Verb Selector

Select a Bloom level, course type, and EEE area. The tool will suggest action verbs, course-outcome stems, assessment methods, and a copyable CO template. It works entirely in the browser and does not require backend support.

Example output (the generic EEE guidance shown before any selection is made):

Recommended action verbs: calculate, solve, analyze, measure, design.

Example CO stems:
  • Calculate and interpret engineering results for a stated EEE problem.
  • Apply appropriate methods and tools to solve a specified EEE task.

Suitable assessment methods:
  • Numerical problem
  • Explanation-based question
  • Laboratory report

These are suggested templates only; final CO wording should match the actual assessment evidence and expected performance level for the course.
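The selection logic such a tool needs is small enough to sketch. The Python below shows one plausible structure, a static table keyed by Bloom level with the course type and EEE area folded into the suggested CO stem; every name, verb list, and data entry in it is an illustrative assumption, not the actual dataset behind the page's tool.

```python
# Hypothetical sketch of a verb-selector lookup. All data here is illustrative.

VERB_BANK = {
    "Analyze": {
        "verbs": ["analyze", "derive", "troubleshoot", "model", "characterize"],
        "methods": ["Derivation with interpretation", "Fault diagnosis", "Lab data analysis"],
    },
    "Create": {
        "verbs": ["design", "develop", "formulate", "prototype"],
        "methods": ["Open-ended design task", "Project report", "Thesis defense"],
    },
}

GENERIC = {  # fallback shown when no specific entry matches
    "verbs": ["calculate", "solve", "analyze", "measure", "design"],
    "methods": ["Numerical problem", "Explanation-based question", "Laboratory report"],
}

def suggest(level, course_type, area):
    """Return verbs, a CO stem, and assessment methods for one selection."""
    entry = VERB_BANK.get(level, GENERIC)
    verb = entry["verbs"][0]
    stem = (f"{verb.capitalize()} a specified {area} problem in a "
            f"{course_type} setting and present assessable evidence of attainment.")
    return {"verbs": entry["verbs"], "stem": stem, "methods": entry["methods"]}

print(suggest("Analyze", "laboratory", "power systems")["stem"])
```

A department adapting this idea would replace the stem template and the per-level data with wording agreed in its own CO-PO mapping exercise.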

Writing Strong Course Outcomes

Weak vs Strong Course Outcome Examples

Weak COs usually fail because the verb does not reveal observable evidence. The improved examples below are not institutional policy statements; they are recommended rewrites that make the evidence, engineering content, and likely assessment path clearer.

Examples of weak and stronger EEE course outcome wording
  • Weak CO: Understand operational amplifiers.
    Why it is weak: The verb is vague and does not show what evidence will demonstrate attainment.
    Improved measurable CO: Analyze and design inverting and non-inverting operational amplifier circuits to meet stated gain and bandwidth requirements, and verify performance using calculation, simulation, or measurement.
  • Weak CO: Know Verilog.
    Why it is weak: The wording does not specify performance, context, or evidence.
    Improved measurable CO: Implement combinational and sequential digital modules in Verilog HDL, simulate their behavior, and verify functional correctness against a given specification.
  • Weak CO: Learn power system faults.
    Why it is weak: "Learn" describes an intention, not a measurable student performance.
    Improved measurable CO: Calculate symmetrical and unsymmetrical fault currents in a specified power network and justify suitable protection settings from the analysis.
  • Weak CO: Be familiar with communication systems.
    Why it is weak: The phrase does not indicate observable evidence or cognitive level.
    Improved measurable CO: Compare analog and digital modulation schemes in terms of bandwidth, noise immunity, and power efficiency, and select an appropriate scheme for a given communication scenario.
  • Weak CO: Understand microprocessors.
    Why it is weak: The content area is broad and the verb is not directly assessable.
    Improved measurable CO: Interface timers, interrupts, memory, and I/O peripherals in a microprocessor or microcontroller system and develop assembly or C routines for specified control and data-handling tasks.
  • Weak CO: Know measurement instruments.
    Why it is weak: The statement does not specify whether the student should identify, use, calibrate, or evaluate instruments.
    Improved measurable CO: Calibrate voltage, current, and sensor measurement instruments, estimate measurement uncertainty, and interpret recorded data for a specified experiment.

Common Mistakes to Avoid

Common pitfalls
  • Using vague verbs such as know, understand, learn, appreciate, or be familiar with, without clarifying the observable evidence.
  • Mapping all COs to low Bloom levels even in senior EEE courses where analysis, evaluation, design, investigation, or tool-based work is expected.
  • Using high-level verbs without high-level assessment, such as writing “design” or “evaluate” but testing only recall or routine calculation.
  • Writing COs that cannot actually be assessed with the available exam, lab, project, rubric, or course time.
  • Using “design” when the task is only calculation and no constraints, alternatives, trade-offs, or performance justification are involved.
  • Over-mapping one CO to too many POs, which can weaken the clarity of evidence and make attainment reporting less defensible.
  • Failing to match exam questions with the stated CO level, especially when course files or CQI reports later claim a higher cognitive level than the actual assessment shows.

These mistakes are repeatedly associated with weak measurability, poor constructive alignment, and low-value attainment data.

Practical CO Writing Formula

Action Verb + Engineering Content + Context/Condition + Expected Performance/Evidence

This formula is a practical recommendation synthesized from learning-objective and performance-indicator guidance. ABET performance-indicator guidance emphasizes observable action verbs and content, while learning-objective design guidance commonly adds conditions and criteria or standards. For EEE course files, this formula helps faculty tie verbs to engineering content and then to evidence that can be assessed, mapped, and used in CQI.
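The formula can even be treated mechanically, which is a useful sanity check when drafting many COs at once. The sketch below assembles a CO from the four parts and rejects vague final verbs; the verb sets and the helper name are illustrative assumptions, not an official departmental rubric.

```python
# Sketch of the four-part CO formula as a tiny composer. Illustrative only.

MEASURABLE = {"analyze", "design", "calculate", "implement", "evaluate",
              "calibrate", "compare", "derive", "validate", "troubleshoot"}
VAGUE = {"know", "understand", "learn", "appreciate"}

def compose_co(verb, content, condition, evidence):
    """Assemble a CO string; reject verbs that hide the expected evidence."""
    v = verb.lower()
    if v in VAGUE:
        raise ValueError(f"'{verb}' is not directly observable; choose a measurable verb")
    if v not in MEASURABLE:
        print(f"note: '{verb}' is not in the illustrative measurable-verb set")
    return f"{verb.capitalize()} {content} {condition}, {evidence}."

co = compose_co(
    "design",
    "a single-stage MOSFET amplifier",
    "for stated gain, bias, and swing constraints",
    "justifying final component values through calculation and simulation",
)
print(co)
```

The point is not to automate CO writing but to make the four-part discipline explicit: if any argument is hard to fill in, the CO is probably under-specified.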

Formula-based examples of measurable EEE course outcomes
  • Circuit analysis: Analyze transient response in first- and second-order circuits using differential-equation and Laplace-domain methods for specified initial conditions, and verify the results through waveform plots or simulation.
  • Electronic circuits: Design a single-stage BJT or MOSFET amplifier for stated gain, bias, and swing constraints, and justify the final component values through calculation and simulation.
  • Digital electronics: Implement combinational and sequential logic modules in Verilog HDL for a given digital specification and verify functional correctness using simulation waveforms and test benches.
  • Power systems: Calculate symmetrical and unsymmetrical fault currents in a specified power network using per-unit methods and justify protection settings from the computed results.
  • Embedded systems: Interface sensors, timers, and serial peripherals with a microcontroller to acquire and process real-time data under stated timing constraints, and demonstrate correct operation on hardware.
  • Communication systems: Compare digital modulation schemes for a stated channel condition and select the most suitable scheme based on bandwidth, BER, and power-efficiency considerations.
  • Laboratory course: Calibrate a voltage, current, or sensor measurement setup using standard instruments, estimate measurement uncertainty from repeated observations and identified error sources, and interpret the resulting data.
  • Thesis or project: Formulate, implement, and validate an EEE solution or investigation for a defined problem using literature, modelling, experimentation, simulation, or prototype evidence, and communicate defensible conclusions in written and oral form.
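The communication-systems wording above leans on a standard quantitative comparison. As an illustrative sketch (not part of the guideline itself), the textbook AWGN result for coherent BPSK, BER = Q(sqrt(2 Eb/N0)), can anchor such an assessment; Gray-coded QPSK shares the same per-bit error rate. The dB values swept below are arbitrary.

```python
# Illustrative BER sketch for coherent BPSK over an AWGN channel.
import math

def qfunc(x):
    """Gaussian Q-function expressed through the complementary error function."""
    return 0.5 * math.erfc(x / math.sqrt(2))

def ber_bpsk(ebn0_db):
    """Theoretical BPSK bit error rate for an Eb/N0 given in decibels."""
    ebn0 = 10 ** (ebn0_db / 10)
    return qfunc(math.sqrt(2 * ebn0))

for snr_db in (4, 8, 10):
    print(f"Eb/N0 = {snr_db:2d} dB: BER = {ber_bpsk(snr_db):.2e}")
```

An Evaluate-level task would extend this with bandwidth and power-efficiency criteria and require a defended scheme selection for the stated channel, matching the CO wording above.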

Quick Faculty Checklist

Before finalizing a CO, check the wording against the questions below. A good CO usually survives all of them.

  • Is the verb measurable? Can the student’s action actually be observed, judged, or rubric-scored?
  • Is the Bloom level appropriate? Does the cognitive demand fit the course level, credit load, and curriculum role?
  • Is the assessment aligned? Do exam items, lab tasks, simulations, projects, and rubrics really match the stated verb?
  • Is the CO suitable for CO–PO mapping? Can the outcome provide clear and limited evidence for the relevant PO or graduate attribute?
  • Can attainment be measured? Is there a realistic plan for direct evidence, analysis, and documentation in the course file?
  • Does the CO avoid vague wording? Replace know, understand, appreciate, and be familiar with unless you make the evidence explicit.
  • Is the CO achievable within the course? Senior project verbs are inappropriate if the course only offers introductory coverage and no supporting assessment.
  • Does the wording reveal evidence? A reader should be able to infer what the student will produce, solve, explain, measure, design, or defend.

References and Further Reading

The following primary sources informed this guideline and provide reliable starting points for faculty, students, and accreditation visitors who want to go deeper.

  1. BAETE. Accreditation Policy (ACC-MAN-01, Version 3.0, effective 1 July 2024), including objectives of accreditation and OBE-based eligibility expectations.
  2. BAETE. Accreditation Criteria (ACC-MAN-02, Version 3.0, effective 1 July 2025), especially the sections on PEOs, POs, COs, CO–PO correlation, direct assessment, faculty responsibility, and CQI.
  3. BAETE. Program-Specific Criteria (ACC-MAN-03, Version 3.0, effective 1 July 2024), especially the EEE criteria on breadth, depth, advanced mathematics, communication theory where relevant, and analysis/design of complex electrical and electronic systems.
  4. BAETE. Definitions and Acronyms (ACC-MAN-06, Version 1.0, effective 1 July 2024), including definitions of direct assessment, indirect assessment, OBE, FYDP, PEO, PO, and CO.
  5. International Engineering Alliance. Graduate Attributes and Professional Competencies (Version 4, 2021), especially the treatment of graduate attributes as individually assessable outcomes and a reference point for outcomes-based accreditation criteria.
  6. ABET. Criteria for Accrediting Engineering Programs 2025–2026, particularly the definitions of student outcomes and continuous improvement, and the program criteria for electrical, electronic, computer, and communication engineering programs.
  7. ABET. Assessment Planning, especially the role of performance indicators, systematic assessment, and using results for decision making.
  8. ABET. Student Outcomes and Performance Indicators, especially the distinction between broad outcomes and concrete measurable performances, and the emphasis on action verb plus content.
  9. Iowa State University, Center for Excellence in Learning and Teaching. Bloom’s Taxonomy, including the revised taxonomy as a common language for learning goals and assessment methods.
  10. University of Illinois Chicago, Center for the Advancement of Teaching Excellence. Bloom’s Taxonomy of Educational Objectives and Learning Objectives, especially the revised level definitions, observable behavior, and alignment of objectives, activities, and assessments.
  11. Oregon State University. Student Learning Outcomes – Measurable Actions, particularly the measurable-verb guidance and the warning against vague verbs.
  12. University of Oxford, Centre for Teaching and Learning. An introduction to writing effective learning outcomes, particularly the emphasis on specificity and avoidance of vague wording.
  13. Johns Hopkins University, Office of the Provost. Writing Effective Learning Objectives / Educational Objectives, especially the three-part structure of performance, conditions, and criteria.
  14. New York State Education Department. Writing Performance Objectives, especially the distinction among performance, conditions, and criterion and the caution against vague verbs.
  15. Qadir, J. et al. Outcome-Based (Engineering) Education (OBE): International Accreditation Practices, American Society for Engineering Education, 2020, especially the discussions on OBE principles, constructive alignment, performance indicators, CLO–PLO mapping, assessment approaches, and CQI evidence.

