So I’m checking out the Rodel blog and run across this gem from Dr. Paul Herdman:
As Board Chair of the Vision Network of Delaware, I am very pleased to welcome Dr. Dana Diesel Wallace as its new Executive Director. The Vision Network is a coalition of school districts and charter schools that are focusing their efforts on three specific areas proven to be critical to student performance: building leadership capacity, strengthening instructional focus, and developing a culture that supports student success. The Network is comprised of 28 schools across eight districts and three charter schools, impacting 23,000 students in all three counties.
After a national search that included more than 100 applicants, the Vision Network board has selected Dr. Wallace, and we believe she will be a great fit. Dana began her new role Monday, filling the post vacated by Mark Murphy when he was appointed Delaware Secretary of Education. She most recently served as the Vice President of School Development for North Carolina New Schools, a public-private catalyst for education innovation. Much like the Vision Network of Delaware, only larger, North Carolina New Schools has joined with partners in business, education, and government to develop and support about a hundred secondary public schools across the state.
During her more than 20-year education career, Dana has been a teacher, principal, administrator, and superintendent. She has a bachelor’s degree in education from Old Dominion University, a master’s in educational leadership from Harvard University, and a doctorate in education from Teachers College at Columbia University. Dana worked for Wake County Public Schools as the senior director for middle school education, and then served as the superintendent of West Fargo Public Schools in North Dakota. As Superintendent, she and her team fundamentally redesigned their curriculum and established a dual enrollment program. In her time there, she moved her district from the 50th percentile in AYP across all grades to approximately the 90th.
Here are the reports from the North Dakota Department of Public Instruction (beginning with the year BEFORE her superintendency and ending with her final year):
I just don’t see the 50th-to-90th-percentile claim at all. Just look at the composite scores. Take note at the bottom of EACH PAGE: AYP not met, each and every year.
Is this the excellence VISION 2015 was seeking from 100 applicants? Really?
Lastly, why is it always the bloggers who find this crap out first? Purely rhetorical, but my guess is that being interested in the answer is a good place to start. To Rodel I must offer: it appears to be exactly the fit you wanted, in keeping with the terrible track record of missing targets thus far firmly established by Vision 2015!
So last night I was checking in on the survey and I started here: http://www.telldelaware.org/
and was greeted with an update:
Feb. 18 UPDATES: The statewide response rate is now over 54%! A special congratulations to the district of Smyrna who now joins Appoquinimink with every school in the district over the minimum response rate of 50% and there are seven more districts that are just a few schools shy of reaching this goal. There are now 147 schools across the state that have reached the minimum response rate and will have their own data to use in school improvement plans!
To view the response rate for every school in the state, please click on the “Response Rate” button located above.
Educators, this is your chance to be a part of the decision making and planning in your own school and district! If you need an anonymous access code to complete the survey, please contact the Help Desk by calling the toll-free number (1-855-258-2818) or click on the “Need Help” button located above.
So I thought to myself that I’d like to see the method for gaining a code by following their instructions and clicking on the NEED HELP button at the top of the page. That led me here: http://www.telldelaware.org/help/index
Submit an issue
Subject of your inquiry
Your comment or question
So, I proceeded to fill out the request. I used a fake name, so I could protect my own anonymity (they don’t make that recommendation and I’m not sure why since this is a one request = one code form), told them I worked at Christina Early Education Center and used an aol.com email address. The response upon form submission indicated they would be in touch. This morning, I went into my email to find this:
From: TELL DE 2013 helpdesk <email@example.com>
Sent: Wed, Feb 20, 2013 7:37 am
Subject: Helpdesk Response for TELL DE 2013
Thank you for contacting the help desk. The code you requested is listed below
and is also located in the attached pdf. Please note that each code is unique
and may only be used once.
1 codes generated for Christina Early Education Center
This is regarding ticket #3945
Here is the .pdf:
I was a bit surprised. Obviously, I was not vetted. I am not the target audience. The fake name is not on a roster of eligible people. So I have some questions which I sent into the News Journal today. Nichole Dobo has posted them as an update to her blog here: http://blogs.delawareonline.com/delawareed/2013/02/16/deadline-extended-for-tell-delaware-survey/
Here they are:
- Since the form asks for a name, and IF someone uses their real name (fake names are not specifically advocated for on the form), how is that person’s anonymity guaranteed when the single code issued, as this one was, is assigned one-to-one to that person?
- Why did they not vet the name submitted against a roster of eligibles at CEEC to determine if a code should be offered?
- Has anyone gotten more than one code, either through this mechanism or by getting additional codes from coworkers who did not want to take the survey for whatever reason?
- How many ineligible people have been issued codes?
I do have a few more questions:
How much money have we spent for the security operation around this survey?
Where is the oversight and quality control?
If I were the Governor, I’d be asking for my money back. The taxpayers deserve it.
Almost forgot, here’s the response rate update (http://www.telldelaware.org/progress/index): up to 55.04% from 54.00% on 2/15/13, when the deadline was extended. A slow slog.
Union President not happy with me based on yesterday’s post (http://doesexperiencecount.wordpress.com/2013/02/20/holy-survey-batman/). Not to worry, Rodelians: we still both care about teachers far too much not to get to the other side of this kerfuffle.
There are so many variables to control in a SLOP survey.
One common type of convenience sample produces surveys that researchers call self-selected opinion polls, or SLOP surveys. As the name suggests, the sample in a SLOP survey is not selected randomly. Instead, individuals choose whether to participate. Margin of sampling error cannot be estimated for a SLOP poll, no matter how large. The classic example of a convenience sample is one done by interviewers who stand in a shopping mall and ask shoppers as they walk by to fill out a survey. That’s perhaps a good way to meet new people but a bad way to select a representative sample of any group. The people who agree to participate may be different than those who do not.
Researchers have learned, often to their great embarrassment, that these types of samples often produce flawed results. Respondents who volunteer to participate in such surveys tend to be more extreme or otherwise very different in their views than those who do not. In no way can they be said to be representative of the population, so the survey results cannot be used to say anything useful about a target population.
The Internet is awash with SLOP polls that invite people to answer a question and then view the results. In addition to attracting only those with an ax to grind on a particular issue, even the best Internet-derived convenience samples currently tend to include too few older people, minorities, and less affluent, less well-educated respondents. In short, they tend to miss people who don’t have access to a computer or an Internet connection. These surveys also invite manipulation, as a number of news organizations have learned to their dismay.
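To make the quoted point concrete, here is a toy Python simulation of self-selection bias. All the numbers are hypothetical (this is NOT TELL Delaware data): it just assumes, as the quote describes, that people with extreme opinions are more likely to volunteer for a survey than people in the middle, and shows how badly that skews the sample.

```python
import random

random.seed(42)

# Hypothetical population: 100,000 people with opinions rated 1-5,
# distributed evenly. None of this models any real survey.
population = [random.choice([1, 2, 3, 4, 5]) for _ in range(100_000)]

# Assumed response model for a SLOP survey: extreme opinions (1 or 5)
# are several times likelier to volunteer than moderate ones.
RESPONSE_PROB = {1: 0.6, 2: 0.2, 3: 0.1, 4: 0.2, 5: 0.6}

# The self-selected sample: each person decides whether to respond.
sample = [r for r in population if random.random() < RESPONSE_PROB[r]]

def extreme_share(ratings):
    """Fraction of ratings at the extremes (1 or 5)."""
    return sum(1 for r in ratings if r in (1, 5)) / len(ratings)

pop_extreme = extreme_share(population)   # about 0.40 by construction
sample_extreme = extreme_share(sample)    # roughly 0.70: extremes dominate

print(f"extreme opinions in population:  {pop_extreme:.0%}")
print(f"extreme opinions in SLOP sample: {sample_extreme:.0%}")
```

Under these assumed response rates, extreme views make up about 40% of the population but roughly 70% of the self-selected sample, which is exactly why no margin of error can rescue a SLOP poll, no matter how many people respond.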
Even if anonymity is protected, how can the DOE prevent survey manipulation like the simple kind I posited in my post title? Here’s what I can tell you: the TELL Delaware survey was at EXACTLY 54% on 2/15/13, and here we are on 2/19/13 at 54.78%. Looks like the extension is failing to drive participation. I wonder what the cause is. Perhaps it’s Facebook-related.
That said, here’s a quick paper on how the style of survey employed here (self-selecting, or SLOP) can often be sheer bunk:
I apologize for the lack of funny causation graphics to help make my point.