
THE LRJA

  • overview
  • putting the LRJA to work
  • taking the LRJA
  • individual report
  • summary report
  • client tools
  • the complexity crisis
  • psychometrics
  • FAQ
  • learn more

Overview

The LRJA (Lectical® Reflective Judgment Assessment) is a written online assessment of the way people think about knowledge, truth, and inquiry. Reflective judgment is closely related to critical thinking and metacognition (thinking about thinking). Well-developed reflective judgment skills support learning, problem-solving, and decision-making, and are a good predictor of academic and workplace success. The LRJA can be employed as a summative, embedded, diagnostic, and/or formative assessment:

  • Summative assessments are used to determine the level of competence of individuals or groups, often as part of a program evaluation or research project.
  • Embedded assessments are used as part of a lesson plan, much as a written assignment might be used to help learners organize what they are learning.
  • Diagnostic assessments are used to find out what individuals already know or have learned, so an instructor or mentor can shape lessons to learner needs.
  • Formative assessments are tests that are learning experiences in their own right, often because they provide rich, actionable feedback or support reflective engagement.

For a discussion of reflective judgment research, instructional ideas, and resources, click here.

The LRJA focuses on reasoning about:

  1. complexity—awareness of complexity as an issue, the nature of complexity, and the skills required to make sound and timely decisions under complexity;
  2. evidence—the information, facts, testimony, and opinions that relate to a particular issue, and how these are evaluated before being used to form a conclusion;
  3. inquiry—the process of goal setting, information gathering, interpretation, and review;
  4. perspectives—awareness of perspectives and how they can be leveraged to improve decisions; and
  5. truth & certainty—awareness that complex decisions are always made under conditions of uncertainty, and the skills required to cope with this uncertainty.

It presents a dilemma that involves conflicting expert perspectives on a thorny real-world problem, then poses a series of questions that ask the test-taker:

  • how it is possible that experts can come to very different conclusions;
  • what it is about complex problems that makes the truth difficult to determine;
  • how to go about gathering the information needed to form an opinion on such problems;
  • how to go about evaluating the quality of information obtained from sources; and
  • how certain we can be about our conclusions regarding complex issues.

LRJA scores are calibrated to the Lectical® Scale, and can therefore be compared to scores received on other Lectical Assessments. To learn more about the scale and its properties, click on the psychometrics tab.

As with all Lectical Assessments, LRJA results are presented in detailed test-taker and client reports that include a great deal of actionable feedback. To learn more about these reports, click on the individual report, summary report, and client tools tabs.


Putting the LRJA to work

The LRJA measures the level of skill with which individuals reason about key dimensions of reflective judgment that are not measured with conventional assessments, including complexity, evidence, inquiry, perspectives, and truth & certainty. It can be employed to support your work in a number of ways:

  1. Taking the LRJA is a reflective activity that draws a test-taker's attention to ideas and issues that are central to decision making under complexity.
  2. LRJA reports are an objective source of information about test-takers' development as decision makers.
  3. LRJA individual reports support educational efforts by explaining what test-takers are most likely to benefit from learning next, and suggesting specific learning activities that are tailored to the needs of the individual learner.
  4. LRJA summary reports make it possible to monitor group results or trace the development of individuals over time, with real-time, presentation quality graphics that will enhance your own reports.

By integrating the LRJA into practice, educators and mentors can diagnose and respond to the needs of individual learners in real time. It is currently employed as a formative assessment in one-on-one coaching, group learning and assessment, and large-scale organizational development. It also functions as a summative assessment in research and program evaluation contexts.

Coaches and educators

One-on-one coaching: The LRJA helps consultants customize their approach to fit the skill sets and learning needs of individuals by providing sophisticated diagnostics and detailed information about the quality and complexity of leaders' reflective judgment skills, as well as richly educative feedback to catalyze their growth.

Group training and assessment: The LRJA can help you meet the learning needs of individuals, no matter how large the group. Our reports highlight individual differences and provide tailored learning suggestions. They also monitor group level effects, such as the overall impact of your educational efforts. This combination of individual and group-level reporting provides a multidimensional view of learning, deepening your insights into individual learners while letting you know when it might be beneficial to adjust your learning goals or instructional approach. In other words, the LRJA facilitates what we call instructional dynamic steering™—the ability to shape instruction dynamically to meet the changing needs of individual learners.

Large-scale organizational development: The LRJA is an ideal tool for measuring the effects of change efforts designed to improve reflective judgment or critical thinking. It can be used to (1) describe, in detail, the range of reflective skills present at different levels of management, (2) assess the fit between employees' current capacity and the task demands of their jobs, and (3) design educational programs tailored to the needs of individuals and the organization as a whole.

Employers

Hiring decisions: The LRJA provides a nuanced and objective source of information about prospective employees' reflective judgment skills. We recommend its use as part of a portfolio of information about prospective employees.

Employee development: Taking the LRJA is a learning experience. It draws attention to important decision-making skills and concepts and provides an opportunity for thoughtful reflection. The individual report acknowledges what employees have already learned, makes individualized suggestions for what comes next, and provides specific, actionable suggestions for how to get there. Importantly, individual test takers can keep these reports private (our default setting), or they can choose to share them with a mentor. As an employer, you would be provided with dynamic summary reports (see the summary tab) that show a wide range of real-time results at the group level, making it possible to evaluate the overall fit between employees' skills and the decision making demands of their jobs.

Researchers

The LRJA is ideal for program evaluation, basic research, and action research:

Program evaluation: Problem- and skill-focused leadership education initiatives are proliferating rapidly, yet good measures of the skills these programs are intended to foster are more than rare—they are virtually nonexistent. With its focus on skills for coping with complexity, the LRJA is tailor-made for evaluating programs of this kind.

Basic research: When we score the LRJA, we make many decisions about performances, each of which is recorded as a data point. We use these data to calculate scale scores and populate reports. For researchers, they are a gold mine, making it possible to address many questions about leadership decision making. We work with graduate students and other researchers as part of our quest to see that these questions are asked. And if you’re interested in collecting your own data, our flexible client tools and well-organized assessment administration procedures can accommodate a wide variety of quasi-experimental designs with ease.

Action research: Many leadership researchers want their work to benefit the systems and individuals they are studying. By integrating the LRJA into pedagogy and practice, educator-researchers can diagnose the learning needs of individuals in real time while simultaneously conducting research on the development of skills that are the focus of instruction.

Taking the LRJA

The LRJA is for anyone who needs to make decisions about complex matters involving multiple legitimate perspectives. The essay questions in the LRJA require test-takers to respond to a dilemma. There are several existing (and tested) LRJA dilemmas, which makes it possible to provide a new dilemma each time an individual takes the assessment. This feature is especially useful if you wish to test the effectiveness of a course or training intervention. Moreover, we can customize dilemmas for particular groups. For example, we have developed special versions of the LRJA for use in workplace and educational contexts. The Global economy dilemma is shown below:

“The current global economy is based on an investment-based economic model that requires relatively constant growth. Economists who favor this model point to evidence that the human condition, overall, has improved as a consequence of this system, which has generated heretofore unheard of wealth and distributed it more widely than any previous system. They express great concern that any interference with the current system could cause a total economic collapse resulting in unprecedented human suffering. However, a growing number of economists are convinced that economic growth inevitably will be accompanied by increasing environmental degradation, contributing in the long term to even greater human suffering.”

If you would like to view other dilemmas, please contact us.

Check it out

You can examine the LRJA by logging on to the Lectica site, clicking on the LRJA icon, and choosing "take the LRJA". You will need to enter information into the starred fields before you will be allowed to move from one page of the assessment to the next. Feel free to cut and paste blocks of random text into the essay fields.

If you do not want to register on the site before trying the LRJA, use the following log-in:

Jane Smith (test-taker): username=janesmith, password=janesmith

Note: If you wish to take an actual assessment, please contact us.

The individual report

Lectical assessments are distinguished by the quality and educational value of their feedback. To create a Lectical Assessment, we work with domain experts to identify core skills and concepts, use what we learn to develop initial research instruments, then study how the skills and concepts targeted by the assessment develop over time, using a research methodology called developmental maieutics. The approach allows us to describe what targeted skills and concepts "look like" in each developmental phase, and to craft feedback and learning suggestions that are specific to each phase of performance.

In addition to general feedback related to the phase of a given performance, LRJA reports, which are delivered online, include comments on strengths and areas for growth, interpretations of scale scores, and targeted learning recommendations.

Reports are generally delivered within 10 business days. We send an email notification each time an assessment is finalized.

If you would like to view an individual report, please contact us.

Check it out

Below are two log-ins that will allow you to view samples of LRJA reports. The first is for Jane Smith, a fictional test-taker who has taken three assessments. If you log in as Jane, you will be able to see her LRJA individual reports. The second is for a fictional client named Ann Brown, Jane's instructor/coach. If you log in as Ann, you will be able to view a summary report (of Jane's results).

Jane Smith: username=janesmith, password=janesmith

Ann Brown: username=annbrown, password=annbrown

Note: These log-ins can be used to view reports and to explore the test-taker and client applications. They should not be used to take an actual assessment.

The summary report

In addition to providing individual reports for our assessments, we offer a dynamic summary report that shows group trends. This report, which is delivered online, allows you to examine change over time on different constructs, or examine the relation between scores and various demographic variables. The report features presentation quality charts you can customize in real time, based on your needs. The information in summary reports can be used to guide instruction, monitor program effectiveness, and examine performance across groups.

Check it out

Below are two log-ins that will allow you to view samples of LRJA reports. The first is for Jane Smith, a fictional test-taker who has taken three assessments. If you log in as Jane, you will be able to see her individual reports. The second is for a fictional client named Ann Brown, Jane's instructor/coach. If you log in as Ann, you will be able to view a summary report (of Jane's results).

Jane Smith (test taker): username=janesmith, password=janesmith

Ann Brown (coach): username=annbrown, password=annbrown

Note: These log-ins can be used to view reports and to explore the test-taker and client applications. They should not be used to take an actual assessment.

Client tools

We offer a set of client tools that streamline a number of assessment management tasks, including:

  1. registering test takers,
  2. assigning test takers to groups,
  3. creating and sending out assessment assignments,
  4. monitoring test-taking activity,
  5. maintaining test-taker accounts, and
  6. communicating with test takers.

We can also customize these tools to meet your particular needs. To learn more about client tools, please contact us.

The complexity crisis: Levels & tasks

During the last 20 years, scholars and practitioners in many disciplines have identified a growing gap between the complexity of 21st-century life and work and the capabilities of individuals. This gap has contributed to what we refer to as a complexity crisis, in which we are forced repeatedly to make decisions without an adequate understanding of their ramifications. We created the LRJA primarily to help fill this gap, but we have also been using LRJA results to learn more about the nature of the gap.

The scoring procedures we use to provide a Lectical® Score also can be used to determine the task demands of a wide range of jobs, problems, or issues. This makes it possible to compare task demands with the levels at which individuals think about those tasks.

Figure: Levels and tasks

The figure above shows the relation between the task demands of four management positions (teal line) and managers' performance on the LDMA (our leadership decision making assessment). It includes the average Lectical scores for 512 government leaders, who were tested between 2002 and 2006 as part of a series of studies conducted for the federal government. The figure illustrates the growing gap between the task demands of successive management positions and the capabilities of individual leaders. (Reports on this research are available on our articles page.)

This trend is even more pronounced in our much larger sample of business managers. There is a clear gulf between the difficulty of upper-level management jobs and the ability of individual humans to meet those demands. The pattern is pervasive—we see it everywhere we look—and it reflects a hard truth. None of us is capable of meeting the task demands of the most complex situations in today’s world. We’ve come to believe that in many situations our best hope for meeting these demands is to (1) work strategically on the development of our own skills and knowledge, (2) learn to work closely with others who represent a wide range of perspectives and areas of expertise, and (3) use the best tools available to scaffold our thinking. It was this pattern that persuaded us to focus on skills like perspective taking and seeking, perspective coordination, collaborative capacity, and reflective judgment.

We aren’t alone. Others have observed and remarked upon this problem:

Jaques, E. (1976). A general theory of bureaucracy. London: Heinemann Educational.

Habermas, J. (1975). Legitimation crisis (T. McCarthy, Trans.). Boston: Beacon Press.

Kegan, R. (1994). In over our heads: The mental demands of modern life. Cambridge, MA: Harvard University Press.

Bell, D. (1973). The coming of post-industrial society. New York: Basic Books.

Reliability and validity

Scores on our developmental scale (the Lectical® Scale) are determined with the Lectical Assessment System (the LAS) or with rubrics (a system for awarding scores to written responses by choosing from a list of descriptions that represent thinking at different phases of development) that have been calibrated to the Lectical Scale. This scale is composed of 14 levels—0 (birth) to 13 (Einstein)—each of which is divided into four phases. It is a non-arbitrary, research-based developmental scale with excellent measurement properties. The reliability (internal consistency) of Lectical scores ranges from .95 to .98, and inter-rater agreement is maintained at a minimum of 85% within 1/5 of a Lectical level. In plain English, this means that the Lectical scale reliably distinguishes 8–12 adult developmental "phases", where each phase represents 1/4 of a level.
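
A minimal arithmetic sketch of how those figures fit together (the assumption that typical adult performances span roughly two to three Lectical levels is our own inference from the 8–12 phase figure above, not a separately stated number):

\[
\underbrace{\tfrac{1}{4}\ \text{level}}_{\text{one phase}} \;>\; \underbrace{\tfrac{1}{5}\ \text{level}}_{\text{rater agreement band}}; \qquad
2\ \text{levels} \times 4\ \tfrac{\text{phases}}{\text{level}} = 8\ \text{phases}, \quad
3\ \text{levels} \times 4\ \tfrac{\text{phases}}{\text{level}} = 12\ \text{phases}.
\]

That is, raters agree within a band narrower than a single phase, which is what makes quarter-level (phase) distinctions across a two- to three-level adult range defensible.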

Research on the LAS has addressed four forms of construct validity:

  • predictive: the extent to which performance on an assessment predicts behavior in the real world;
  • convergent/divergent: the extent to which one measure of a given dimension does the same thing as another measure of the same (or a different) dimension;
  • ecological: the extent to which an assessment measures things that are of value in real life (the ecological validity of the LRJA is apparent in the relevance of (1) its content, (2) the skills required to complete it, and (3) the scores and feedback provided in its reports); and
  • measure validity: the extent to which moving from one level to the next is the same, no matter which two levels you are looking at (as in, "All inches are exactly the same length," or "moving from level 2 to level 3 is just like moving from level 3 to 4").

The LAS is based on a strong theory of development called Dynamic Skill Theory, and it has been submitted to a number of rigorous tests of its ability to capture the developmental construct described in that theory. These tests have shown that the LAS does a very good job of capturing this dimension. To view the evidence, see the refereed papers on the articles tab of the literature page and the articles by our colleagues on the decision making references page.

For an in-depth explanation of our approach to validity and reliability, we suggest that you read our information page on the validity and reliability of the LAS. To learn more about reliability and validity as constructs, see Dr. Dawson's blog.

FAQ

What is a developmental assessment?

A developmental assessment is a test of knowledge and thinking that is based on extensive research into how people come to learn specific concepts and skills over time. All good developmental assessments require test-takers to show their thinking by making written or oral arguments that support their judgments. Developmental assessments examine how people use their knowledge and thinking skills to solve problems. Typically, there are no “right” answers in a developmental assessment.

Our developmental assessments test thinking in specific areas of knowledge, such as leadership decision making, moral reasoning, or physics. Within each area of knowledge, the ability to work with complexity increases in a systematic way. Over the last 20 years, we have developed a system for measuring this growth reliably and accurately.

Our assessments are developmental because they identify where a person’s current reasoning fits in the sequence of skill development for the area of knowledge being assessed. This requires understanding both the sequences (descriptions of successive levels of skill or understanding in a particular area of knowledge) and the pathways (the alternate routes people can take toward mastery of a concept or skill) through which skills develop. Lectica is able to meet this objective because we have collected and analyzed thousands of clinical interviews and written responses, gradually building the knowledge base required to understand how specific skills develop over time in specific areas of knowledge. This research is widely published in peer-reviewed journals, books, and on the web. To learn more, go to our literature page.

What does Lectical® mean?

Lectical is taken from the word dialectical, which refers to a process for determining the “truth” by exchanging logical arguments. Our reference to this term is a celebration of the philosopher Hegel, who proposed a dialogical truth-building process with three repeating steps—thesis, antithesis, and synthesis. Many scholars think of development as a kind of natural dialectic, in that it results from feedback-rich interactions between an individual and the environment that change the way the individual thinks or behaves.

So, to recognize Hegel and the dialogical nature of development, we named our developmental scoring system the Lectical® Assessment System (LAS), and we call assessments that are developed or scored with the LAS, Lectical® Assessments.

What is a Lectical® Assessment?

At Lectica, we specialize in designing and administering developmental assessments called Lectical® assessments. Lectical assessments are a major advance over conventional assessments, because they not only determine (1) what test takers know, but also (2) how well they apply their knowledge in real-world situations, and (3) what they need to learn next to advance to the next level of skill.

How does a Lectical Assessment work?

Lectical assessments are designed to measure the degree of cognitive complexity that underlies thinking—how much people can see and hold, and how much and how well they can elaborate, integrate, coordinate, and communicate what they see.

The Lectical Score is an index of the level of complexity demonstrated in a person’s performance. It provides information about the skill with which a person uses his or her knowledge to think about an issue. This is fundamentally different from most assessments, which focus primarily on factual knowledge or the application of learned procedures.

Lectical Assessments include other scores, which we call scale scores. These are based on the particular concepts and ideas expressed in a performance, and allow us to provide specific information about a test taker’s performance on specific sub-skills or themes.

What kind of information does a Lectical assessment provide?

At Lectica we believe that a good test should provide more than an accurate score. It should also support development. Lectical Assessments support development in two ways. First, they are learning experiences in their own right, drawing attention to important ideas and issues, and providing an opportunity for thoughtful reflection. Second, each assessment is accompanied by a report that is tailored to the learning needs of the individual test taker. Your clients (students, employees) will learn what they’ve accomplished, what comes next, and how to get there. And our summary reports allow you to view group results or follow test takers over time, with real-time, presentation quality data and graphics.

Learn more about Lectical® Assessments

To learn more about using Lectical Assessments, please contact us.

For more about our research and methods, we suggest the following links from our In Plainer English collection.

Virtuous cycles of learning (a white paper about the learning model upon which our assessment strategy is based)

How to take a Lectical Assessment (instructions for writing responses to assessment questions)

Introduction to the LAS (Lectical Assessment System)

A comparison of the LAS with other scoring systems

About measurement

Developmental maieutics

Constructing developmental sequences

Our levels and theirs (a table aligning Lectical levels with the levels of other cognitive-developmental assessment systems)

If you are very curious (or academically inclined), you may also want to read some of the refereed articles on the "articles" tab on our literature page.