Quick: How do you find the volume of a cone?
It's safe to say that the vast majority of adult Americans don't know the answer to that question. And Sam Houston, CEO of the North Carolina Science, Mathematics, and Technology Education Center, who works to improve STEM education, said at U.S. News' 2012 STEM Summit that there's no reason to expect a better outcome.
"Those are either mathematicians, engineers, or very strange people," Houston said of people who know off the top of their heads that a cone's volume equals one-third pi times the radius squared times the height.
"In the adult world we don't use information like we taught it in school. [But] we spend so much energy valuing trivia like that," he said.
This illustrates just one problem that may arise in the area of STEM metrics. A student might learn and be tested on the formula for a cone's volume in 7th grade, for example, then forget it completely until it shows up on another test.
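For the record, the formula in question, one-third pi times the radius squared times the height, is the kind of fact that is trivial to look up or compute rather than memorize. A quick sketch in Python:

```python
import math

def cone_volume(radius, height):
    """Volume of a right circular cone: one-third the base area times the height."""
    return math.pi * radius ** 2 * height / 3

# A cone with radius 3 and height 4 has volume 12*pi, roughly 37.7.
print(cone_volume(3, 4))
```

The one-third factor is exactly the piece most adults forget: a cone holds a third of the volume of the cylinder that encloses it.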
Measuring STEM progress is a problem that plagues states nationwide. While Houston toils away in North Carolina, other states struggle with their own measurement issues.
Chris Roe, CEO of the California STEM Learning Network (CSLNet), spoke about some of his organization's STEM goals for California students: passing Algebra I by the 8th grade, for example. But another measure of progress, he says, is "STEM proficiency"--a quality he admits there is no good way to quantify.
For his part, Mark Lewis, senior program officer at Washington STEM, points to a quote attributed to Albert Einstein: "Not everything that counts can be counted, and not everything that can be counted counts."
Even though data on educational progress abound--AP and IB scores, ACT and SAT results, GPAs, and the like--Lewis says that some important indicators are impossible to count.
"We don't have the instrumentation to really understand, let's say, a student's 'STEM identity'--how they conceive of themselves in relation to whatever is 'STEM,'" he says.
These sorts of problems are inspiring organizations in California, North Carolina, and Washington to come up with new ways of counting their states' progress on STEM success. The conversation raises basic questions about what makes good, useful data, what "STEM literacy" means, and what the desired outcomes of STEM education should be.
There are plenty of new ideas, but first, it may be necessary to correct a few issues on a basic level.
"Datasets are notoriously dirty," says Lewis, as different schools or education systems might count different outcomes in different ways.
That means aggregating data from all of these different sources can be a mess, which makes the prospect of improving STEM education all the more daunting.
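Lewis's "dirty datasets" point is easy to see with a toy example. The records, field names, and codings below are invented for illustration; the point is only that when two school systems record the same outcome differently, the data must be normalized before it can be aggregated into a single statewide rate:

```python
# Hypothetical records: two districts reporting the same outcome
# (passed Algebra I by 8th grade) under different field names and codings.
district_a = [{"student": "A1", "alg1_by_8th": "Y"},
              {"student": "A2", "alg1_by_8th": "N"}]
district_b = [{"student": "B1", "algebra1_passed_gr8": 1},
              {"student": "B2", "algebra1_passed_gr8": 0}]

def normalize(record):
    """Map each district's coding onto one shared boolean field."""
    if "alg1_by_8th" in record:
        passed = record["alg1_by_8th"] == "Y"
    else:
        passed = bool(record["algebra1_passed_gr8"])
    return {"student": record["student"], "passed_algebra1_by_8th": passed}

combined = [normalize(r) for r in district_a + district_b]
rate = sum(r["passed_algebra1_by_8th"] for r in combined) / len(combined)
print(rate)  # fraction of combined students who passed
```

Multiply this by dozens of districts, each with its own conventions, and the cleanup work ahead of any statewide STEM metric becomes clear.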