A working space specific to the "Task A" subgroup on assessment tools and practices...


National Center for Postsecondary Research 2012 conference:

"Strengthening Developmental Education: What Have We Learned, and What's Next?"

(click on "Downloads" link for access to conference papers and presentations)


Pamela Burdman, August 2012: "Where to Begin: The Evolving Role of Placement Exams for Students Starting College" (Jobs for the Future paper, in conjunction with Achieving the Dream and Developmental Education Initiative)


New and relevant working paper from CCRC:

"The Opposing Forces that Shape Developmental Education: Assessment, Placement, and Progression at CUNY Community Colleges," by Jaggars and Hodara, November 2011



Joe Montgomery's "starting point" document--"Thoughts on Task 1" (renamed "A" to try to reduce confusion with larger Group 1 :-):


Joe also produced a nice summary of the CCRC "Assessment of Evidence" paper on developmental assessment and placement tests:

And here's a link to the actual study on the Community College Research Center web site... Also, here's a link to an interesting //Inside Higher Education// blog post referencing the study; it mischaracterizes the study findings a bit, but it does include a number of responses that are entertaining (and some even enlightening :-)...

Michael Collins' paper on the Developmental Education Initiative (available from the National Center for Postsecondary Research web page on the fall 2010 conference they hosted) includes some specific recommendations about state policy related to placement assessment:

  1. Standardized assessment and placement policies

  2. Diagnostics to differentiate need and intervention

  3. Policies prescribing early elimination of academic deficiencies

  4. Alternatives to developmental education for students near a certain cut score


Hawai'i Strategies Institute presentation (from the Achieving the Dream conference):


I don’t know that much about the reading/writing side of things, as most of my work over the past few years has been focused on math; from what I can tell, it’s probably important for us to differentiate these areas, since they seem to have some significantly different issues and concerns.

You can see a number of links to placement-related material (mostly, but not all, math-specific) on the TMP web site here (scroll to “Placement Test Issues”); one of the more interesting links is to California’s Mathematics Diagnostic Testing Project (MDTP), offering a set of tests from Pre-algebra Readiness to Calculus Readiness. Another test option that’s relatively new and isn’t listed on the TMP site is the MAA (Mathematical Association of America) partnership with Maplesoft to produce a suite of placement tests based on a set of paper-and-pencil tests that the MAA used to provide (and that I understand were highly regarded by quite a few math faculty in the system).

And of course there’s also Washington’s own WAMAP (Washington Mathematics Assessment and Placement), where students can currently take a practice version of the College Readiness Math Test (CRMT) as well as a number of locally developed diagnostic assessments (the former is available to anyone--click “Placement,” then enter a random name and phone number--while the latter are password-protected on the site)...
Bill



California CC system project on a "centralized assessment delivery system" (see my discussion post for more info on this topic; Joe Montgomery has also talked to a couple of CA system reps involved in this work)...

Here's a thoughtful and brief piece on "directed self-placement" (DSP) from Dan Royer and Roger Gilles at Grand Valley State University. One could argue that what they're describing is more applicable to the relatively straightforward decision space around whether or not to enroll a given student in a college-level writing class at a 4-year institution, but I do think the deeper conceptual questions they're framing work just as well if you substitute "math" for "writing," at least around the core decision of "college-ready or not."


And here are two fairly typical implementations of DSP in a writing context: a "writing attitudes survey" and what's referred to on this site as the "Daly-Miller Survey"...


LINK TO MARCH 9 WEBINAR RECORDING: OVERVIEW OF PRECOLLEGE PLACEMENT ASSESSMENT LANDSCAPE (and here is the PPT file with the live links that I hope work!):


Compass Overview webinar, Stephanie Lewis (ACT):

ACT Webinar 3-8-11

Responses from ACT rep to questions posed by Regina Reed, Walla Walla CC:

1. Regarding reliability, how is that measured?
Reliability figures can be found in our manual.

2. What is the reliability of the e-Write writing placement compared to the “traditional” writing placement test?
This is a great question and one I have been asked a lot lately; unfortunately, we have not done any studies comparing the two. I'm going to forward this question on to our research group again so I can stress the importance of such a research endeavor.

3. We’ve studied COMPASS as a reliable predictor of student success (earning a 2.0 or higher) in the course in which the student places. We’ve seen that COMPASS is a good predictor of student success in writing but a poor predictor for math. What are some factors that can account for this difference?


The reasons math placement might not be working (or might not appear to be working) could be any of a variety of things, but off the top of my head I would ask the following questions:
a. When was the last time cut scores for the math courses were evaluated, and what was the process? Cut scores are not meant to be static but dynamic; they should be evaluated every 1 1/2 - 2 years. That doesn't mean they will change that frequently, but they should be examined.
b. What (if any) is the retest policy? Is remediation required before retesting?
c. Is placement mandatory? That is, COMPASS says you should take course X, but an advisor can override that decision, or the student can say "thanks for the recommendation, but I'm taking the next higher course"--the right-to-fail method.
d. Are the diagnostics being used for students in decision zones? Perhaps some students who miss the class by a hair are merely rusty, and a review of the areas in which they are struggling will help them go on to take the more challenging course. If students who are just rusty are placed in a lower course, they run a high risk of failing the easier class because they become complacent. Additionally, for students who barely get into the class, diagnostics can be a tool that says: you made it into this challenging course, but you have deficiencies in these areas, and to be successful you should do additional work in the areas identified as needing improvement.
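
As a follow-up to point (a) above, here's a minimal sketch of one way a college might check locally whether a math cut score still predicts success. It's purely illustrative, not ACT's validation methodology; the records, field layout, and band width are all invented for the example.

```python
# Hypothetical sketch of a local cut-score check -- not ACT's validation
# method. Assumes records pairing each student's COMPASS math score with
# whether they earned a 2.0 or higher in the course they placed into.

def success_rate_by_band(records, cut_score, band_width=5):
    """Group students into score bands relative to the cut score and report
    the share earning a 2.0+ in each band; if success rates look flat
    across bands, the score is adding little predictive information."""
    bands = {}
    for score, succeeded in records:
        band = (score - cut_score) // band_width  # band 0 starts at the cut
        passed, total = bands.get(band, (0, 0))
        bands[band] = (passed + succeeded, total + 1)
    return {b: passed / total for b, (passed, total) in sorted(bands.items())}

# Made-up records: (COMPASS score, earned a 2.0+ in the placed course)
records = [(42, False), (48, True), (51, False), (55, True),
           (61, True), (38, False), (66, True), (58, True)]
print(success_rate_by_band(records, cut_score=50))
```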


To view the recording of the Accuplacer overview webinar by Brett Miller, College Board, click the link below:

View the Elluminate //Live!// recording


ACCUPLACER Q & A (Brett Miller, College Board)



Can you select the level that a student starts at on the placement test?
No, but you can select the test; the student always starts at a medium-difficulty item and then branches depending on the response.
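
For those curious what that branching looks like in general terms, here's a bare-bones sketch of computer-adaptive item selection. It's a generic illustration under invented assumptions (the item bank, difficulty scale, and fixed step size are all made up), not College Board's actual algorithm.

```python
from collections import namedtuple

# Generic computer-adaptive branching sketch -- illustrative only, not
# College Board's actual algorithm. Difficulty is on an arbitrary scale
# where 0.0 represents a medium-difficulty item.
Item = namedtuple("Item", ["id", "difficulty"])

def adaptive_test(item_bank, answers_correctly, n_items=10):
    """Start at medium difficulty; step the target up after each correct
    response and down after each incorrect one, always administering the
    unused item closest to the current target."""
    target, remaining, log = 0.0, list(item_bank), []
    for _ in range(min(n_items, len(item_bank))):
        item = min(remaining, key=lambda it: abs(it.difficulty - target))
        remaining.remove(item)
        correct = answers_correctly(item)
        log.append((item.id, item.difficulty, correct))
        target += 0.5 if correct else -0.5
    return target, log  # crude "score": the difficulty level reached

# Demo: a simulated student who can handle items up to difficulty 0.5
bank = [Item(i, d) for i, d in enumerate([-2.0, -1.0, -0.5, 0.0, 0.5, 1.0, 2.0])]
score, log = adaptive_test(bank, lambda item: item.difficulty <= 0.5, n_items=5)
print(score, log)
```

A real adaptive engine would use item response theory to estimate ability rather than a fixed step size, but the start-at-medium-and-branch-on-response behavior is the same basic idea.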

What are the costs?
In Washington, current costs are $1.95 per test unit (typically 5-6 units per student); the cost could be reduced to as little as $1.55 per unit based on volume increases.
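(Assuming the typical 5-6 units per student quoted above, that works out to roughly $9.75 to $11.70 per student at the current rate, or $7.75 to $9.30 per student at the volume rate.)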


Is it possible to add customized questions regarding student background, experiences, attitudes (e.g., about high school math course experience and grades, whether they took math as a senior, etc.)?
Yes

Are statistical analyses available on these background/supplemental questions, comparable to what's available for the standard test items?
This should be covered in the ACES Validity Study material posted below:


How are the background questions used in making placement decisions?
Currently, answers to background questions help determine the appropriate starting test for students; multiple measures are used, with weights assigned to scores based on answers to the background questions.
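
As a toy illustration of the weighting idea: the measures, weights, and cut score below are all invented for the example, not ACCUPLACER's actual scheme.

```python
# Toy multiple-measures placement -- invented measures, weights, and cut
# score for illustration; not ACCUPLACER's actual weighting scheme.

def composite_score(measures, weights):
    """Weighted average of measures that have each been normalized to 0-1."""
    total = sum(weights[name] for name in measures)
    return sum(measures[name] * weights[name] for name in measures) / total

# Hypothetical student: test score plus two background-question responses
measures = {"test_score": 0.72,       # normalized placement test score
            "hs_gpa": 0.80,           # normalized high school GPA
            "math_as_senior": 1.0}    # took math in senior year (1 = yes)
weights = {"test_score": 0.6, "hs_gpa": 0.3, "math_as_senior": 0.1}

score = composite_score(measures, weights)
print("college-level" if score >= 0.7 else "developmental")  # 0.772 here
```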

Are there plans to develop a diagnostic at the Intermediate Algebra level?

Not at the moment, as far as I know; many of the content areas within the Elementary Algebra diagnostic could be considered intermediate algebra content.

Is the test able to route students directly from the regular tests into the diagnostics?
Yes

What’s the relative emphasis of procedural skills vs. conceptual understandings on the math diagnostic tests?
We're not sure what this means; however, I think the Program Manual might be able to answer this question. I'd be happy to send you a PDF of this manual (about 129 pages); it has a clear index to point you to the Diagnostics area.

Can the Accuplacer diagnostic test be used as a pre-post measurement?
Yes; because the tests are computer adaptive, the items would likely not be exactly the same, but they would be drawn from the same pool of items and be comparable. A number of states are pursuing this approach at the high school level.

Are there reports/studies available on the diagnostic assessments?
In progress


April 4 webinar recording: "Discussing Principles, Policies, and Procedures for a Placement Assessment System"

Selected key slides from the presentation/discussion:


(with notes, comments, questions from working group volunteers)


From the recently published College Completion Toolkit report produced by the US Department of Education--nothing particularly earth-shattering here, but I thought it was worth highlighting the one recommendation relevant to our work group's effort:

Use early assessment of college readiness to reduce the need for remedial education at the postsecondary level.

States can encourage secondary schools and adult education programs to upgrade curricular rigor to higher education expectations by incorporating college placement exam questions into state high school tests and requiring public colleges to use consistent "cut scores" on those tests for placement in freshman-level courses. California, for example, added a series of college-readiness questions to the state’s 11th-grade exam. After students take the test, they are told whether they are on track for credit-bearing classes at colleges in the California State University system, so that both the student and high school can make necessary adjustments in the subsequent course of study. Without that system, many high school graduates and GED holders enter college only to take placement tests and discover they are not ready for college-level work.




Archived recording of the April 21 webinar with Wendy Swyt, Highline CC: "Writing Placement: A Tense Interface"
Slides here:



(from Regina Reed, Walla Walla): Here is my e-Write data; we used longitudinal data to review trends in pass rates pre- and post-intervention (e-Write). We are still collecting information, as this is a new tool for our campus, but the initial data is promising. We wouldn't say it is flawless, but it doesn't appear to have a negative impact on precollege English placements.