Everyone else will talk about this special feature
(small classes! split by ability!) or that statistic (more people
take our course!). But if smaller classes raised scores more, why
don't we see it in their advertised results? If more people taking
the course meant it worked better, why don't we see that in their
advertised results, either?
TestWell's LSAT 180 Course has the highest independently-verified
average score increase of ANY LSAT Course in the country.
In 1998, that number was +9.5 points.
(Since then we've upgraded the Course three times, and the
latest figures from students reporting in show the average increase
has risen to +10.7 points. That figure comes from the 32% of our
2002 student body who checked in.)
By contrast, Kaplan's last independently-verified average
increase, which dates from 1993, was 7.2 points. Princeton's
advertised number is 7 points. TestMasters and PowerScore have never
provided independent verification for any score-increase claim
-- which is why they post none on their websites (though they make
all kinds of unfounded claims when you call their sales centers).
Of course, if you want to look at the popularity contest, consider this:
In almost every school at which we are directly competitive with
other test-prep firms -- making apples-to-apples comparisons possible
-- we out-enroll the mass-market firms even as they out-advertise
us by (literally) hundred-to-one ratios.
At Harvard, we've been the #1-selling firm for every one of the
last seven years. At Wellesley, we've been #1 for the last six.