With members of the Board of Education and school administration acknowledging they’re not sure what Artificial Intelligence (AI) will mean to students or society in general, a proposed policy aims to begin codifying this unknown.
Superintendent Kevin Smith shared a first draft of a Generative Artificial Intelligence Policy — among several others — with the BOE on Thursday night, Dec. 19, explaining that, among other things, it codifies the establishment of a district AI Task Force.
“When the AI committee had met last year one of their topics over the course of the year was if there’s a policy, what should be in it, and so that group did generate some policy recommendations,” Smith said.
Wilton Public Schools is introducing AI into the classroom as a teaching tool next month as part of a pilot program.
“This policy really does two things,” Smith said. “One, it recognizes the potential of AI.”
And while he said there’s a general belief that it could prove to be “a really strong learning tool,” they also understand that it’s evolving and that they don’t really know the extent of its potential.
“The second thing this does is kind of formally establish a district AI Task Force that is going to be responsible for a number of things,” he said, including engaging in conversations about AI within the district and deepening understanding of what it is, how it’s changing and how it can serve the schools.
The Task Force will also recommend policy updates, establish guidelines for AI use by both students and staff, augment professional development for staff and training for students, and generally serve to educate the district as a whole about AI and AI-related topics.
“So we have an AI Task Force in place and it’s been doing some of these things, but this (is) kind of formal direction-setting,” Smith said.
According to the policy draft, “Generative Artificial Intelligence refers to software that employs machine learning algorithms to co-create text, media, and other content forms based on prompts from the user. As with all technologies, users must be mindful of and adhere to all considerations ensuring responsible and ethical use, especially as it relates to mitigating bias, promoting transparency, and ensuring the benefits of AI are accessible to all students.”
BOE Chair Ruth DeLuca praised the policy for cross-referencing the district’s responsible-use policy, as well as the state’s Student Data Privacy Act.
“It probably ought to cross-reference back to that academic honesty piece so they’re linked in together,” she said, noting a policy draft discussed earlier in the meeting.
According to that new policy draft, “the district will not tolerate academic dishonesty,” including cheating and plagiarism, and it specifically addresses work or materials generated by artificial intelligence.
Smith said the policy would be there to “support and enhance the handbook regulations” for student behaviors and conduct.
The question was raised as to whether AI itself could be trusted to furnish appropriate citations for the material it generates, and whether this could inadvertently cause students to violate the policy.
“Does the AI tool itself do a good enough job of citing its sources? …” BOE member Pat Pearson asked. “We’re going to have them use this technology. We have to be very specific about this piece of it.”
BOE member Heather Priest asked whether the teachers had had a chance to look at this policy.
“I’d just be curious what their thoughts were,” she said.
Smith said he would try to get some response from them before the next BOE review in January.
“I think it’s important for the value of academic honesty to be staked out,” DeLuca said, “and to be put in place, especially as traditional notions of integrity and honesty shift over time as technology advances.”
“I do like that it focuses on the positive, that expectation, and the understanding is that we operate in a community that values honesty and behaves in that manner,” she said.