Actual SBAC Testing Conditions (California)


K12NN will be posting experiences teachers have as they prepare their students for the SBAC or PARCC Common Core State Standards tests, or actually administer them, as testing season rolls out across the country. The SBAC and PARCC tests are administered by computer and function as the end-of-year summative assessment for students. These stories, while anonymous, come from actual classroom teachers whose identities K12NN has verified. Their identities are not disclosed to protect them from any possible political repercussions in the workplace.

From a California Teacher (SBAC), part III (read part I and part II here):

My math classes completed the SBAC tests this week. We were assigned two 2-hour blocks (one block on one day and another on a second consecutive day). My testing times backed up to the testing days I was to proctor so I spent FOUR hours for each of two days with the SBAC.

I had the opportunity to proctor two classes that were not my own students as well as two classes of my own students. The range I observed was 6th through 8th grades: low 6th, honors 6th, super honors 7th, and super honors 8th. The problems faced were astounding and ranged from Chromebook issues and server connection problems to unclear questions (as far as how to answer them) and mathematical inaccuracies.

I was able to debrief with my six classes today (most of the students, anyway, as a few were not in class because they were finishing up the testing). Let's start with the Chromebooks: I had observed earlier the necessity for students to 'up' the volume as soon as the Chromebooks were opened. Why? Because once the test was underway, it was near impossible to change the volume of the 'text-to-speech' or 'aural' problems. When I mentioned this to our admin, it fell on deaf ears.

The first day, the classes I proctored spent upwards of 15 minutes simply getting logged on. Part of the problem was this ‘volume’ issue where students had to log back out, fix the volume control, and log back in. The admin in the room and I had it out; I was accused of ‘not being in the room when the soundcheck was done.’ Really? I don’t think so.

The so-called soundcheck took place AFTER the students had logged in, as part of the SBAC. I tried, to no avail, to explain the importance of checking the volume PRIOR to logging in. I gave up and made sure, when I had my own students, to remind them that they needed to check the volume controls before we even began the test.

Continuing with the Chromebooks…some students had issues with them ‘freezing’ during the test. In other words, the cursor wouldn’t move no matter what they did. It wasn’t a ‘mouse’ or ‘trackpad’ issue either. Students used both to see if the problem was one or the other. Nope. Probably a software issue.

Another ‘freezing’ that occurred was that a particular page would take forever to load. Students were told to be patient, but a number of them had to reboot. This, to me, was a server issue. Despite having a designated ‘testing’ server, the inadequacies were apparent. A number of my students shared that, during the test, their Chromebooks shut off completely. In other words, the screen went black. Several students said this happened several times. How frustrating!

To log in, students had to input a ridiculously long state code as well as a test code. For those who do not have computers at home and are unfamiliar with keyboarding, this was a slow, laborious process. Despite what MOST people think, many of my students do not have computers at home.

Now, onto the test itself! Holy cow! Are you serious? The language arts portion had fewer issues than the math. The biggest problem with the language arts portion was the fact that the students were NOT permitted to have the use of pencil and paper. ALL of their work had to be done on the Chromebook, including note-taking. The fact that students were familiar with making a variety of graphic organizers to help with their writing was a non-issue to the test…students were forced to put ALL of their thinking into the Chromebook instead of being able to sketch out or outline what they wanted to do. As a former language arts teacher, this, to me, was insane. As an adult who prefers to use a computer, even I often write things down on paper first!

One student called me over to ask, "Is claims another word for bird?" I looked at him askance. He clicked the cursor on the Dictionary and showed me that when he clicked on the word 'claims,' the SBAC Dictionary defined it as 'a type of bird.' He was cracking up, thinking it was the funniest thing ever!

In the math section, some of the questions did NOT have enough information necessary to answer the question! One student shared, “Ms. S, I have to find the volume of a cylinder. The only information that is given is the circumference! How can I possibly figure out the volume without the height?!”
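The student's objection checks out: the circumference pins down the radius, but the height remains a free quantity, so the volume is underdetermined:

```latex
C = 2\pi r \;\Rightarrow\; r = \frac{C}{2\pi},
\qquad
V = \pi r^2 h = \frac{C^2 h}{4\pi}.
```

Since $h$ can be any positive number, infinitely many cylinders share the same circumference, and the question as described has no unique answer.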

Another student had a problem where the graphics and wording conflicted with one another. He was confused, then started laughing. He said, “I guess horizontal means the same as vertical!” The picture showed a shape being folded VERTICALLY although the wording was that the shape was folded horizontally.

In another problem, the student was to type in an equation with a variable. The window with the symbols included a character that looked like the X for multiplication but also a *. Which one was the student supposed to use as the variable? On the very next problem, the multiplication symbol was, gasp, an X.

In one other case, it took two proctors to figure out just HOW a student who wanted to use a square root symbol as part of her answer could actually type it in. Was this a test of a student's knowledge? Or a test of how to use a variety of commands designed by test-makers who didn't collaborate and come to consensus for continuity? Oh, yes, there were a number of these types of asinine examples. Many students were rolling their eyes during the math portion and shared later that, although the problems weren't all that difficult, many didn't make sense.

Oh, I forgot to mention something that was troubling…on the math test (it appears most of the problems were with the math portion of the SBAC) a number of students completed 25 out of the supposed 27 questions. The test indicated that the student had completed the exam, but '25 out of 27' kept showing up.

Students went back through all the given questions only to find that there were only 25, NOT 27 as indicated. Weird…

I find it reprehensible that so much money, not to mention time, has been spent on these tests. It cost a minimum of $2 for each pair of earbuds that had to be purchased so that each student had his/her own pair to use.

It cost at least $250 for each Dell Chromebook (I couldn’t find one for less than $279 with an educational discount), and 400 were purchased in addition to the specialized electrical cart that housed them. I don’t know if the mice were an additional cost or if they came with the Chromebooks. I have a sneaking suspicion they were an added price, particularly since no one knew that mice were available to be used with the Chromebooks until I shared, through my ‘practice’ sessions with my kids, the necessity for many to use a mouse.
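Taking the figures above at face value, a back-of-the-envelope tally is easy to sketch. This is a hypothetical calculation, assuming 400 earbud pairs (one per Chromebook) and excluding the cart and the mice, whose prices aren't given:

```python
# Back-of-the-envelope hardware cost from the figures quoted above.
# Assumptions (not in the source): 400 earbud pairs, one per Chromebook;
# cart and mice excluded because no prices were given.
CHROMEBOOKS = 400
CHROMEBOOK_PRICE = 250   # stated floor; the author couldn't find one under $279
EARBUD_PRICE = 2         # stated minimum per pair

total = CHROMEBOOKS * CHROMEBOOK_PRICE + CHROMEBOOKS * EARBUD_PRICE
print(f"${total:,}")  # → $100,800
```

Even with the cheapest quoted prices, the outlay runs to six figures for a single school's devices.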

Some may say that because this year is a 'baseline' year, we shouldn't worry about the scores; yes, the scores that NO ONE will ever have access to. The students were stymied as to why they were even taking the tests if no one was getting any feedback. I resent the fact that my students were forced into a test whose results were not shared; essentially, a meaningless test.

Who benefits from this year’s SBAC? Ask my students and they will tell you, on their own, that the manufacturers of the mice, earbuds, and Chromebooks are the winners. Students being used as guinea pigs? Hey, I had THAT experience last year! Two weeks of my precious instructional time was used up as the smartest math students in the district were utilized as ‘guinea pigs’ for the testmakers. The good thing that did come from that experience? The testmakers realized that 80 to 90 math problems were excessive…so, this year the students ‘only’ had 25 to 27 questions. Hmmm…to me? 25 to 27 is still an EXCESSIVE amount when the exam is obviously not designed to be of any use to anyone except for financial gain.

Copyright 2014 K-12 News Network's The Wire