
Sunday Edition: Big Picture

Security camera at the New Hanover County Schools Board of Education building. (Benjamin Schachtman / WHQR)

Sunday Edition is a weekly newsletter from WHQR’s News Director Benjamin Schachtman, featuring behind-the-scenes looks at our reporting, context and analysis of ongoing stories, and semi-weekly columns about news and media issues in general. This week: a deep dive on the stakes and concerns around the AI-security pilot in New Hanover County schools.

This week, the New Hanover County Board of Education shot down a pilot program to install AI-powered surveillance technology in district schools. The $3.2 million state-funded program promised to provide real-time monitoring of the school district’s hundreds of cameras, delivering alerts to principals and other administrators when the software detected things like surging crowds, unsecured entrances, weapons, or smoke and fire.

The pilot also prompted intense debate about privacy, data security, personal liberties, long-term funding concerns, and the political origins of the grant program. While there’s been a fair amount of misinformation thrown into the mix, there are also plenty of legitimate concerns. It’s not my intention (or my job) to tell people how they should feel about this program — but I do want to try to lay out, as best I can, what the stakes are.

I say that because the Board of Education may take another run at the issue at their regular meeting on Tuesday, April 1 — but also because the AI issue is not going away, nor is the violence in our schools that’s pushing faculty, staff, legislators, parents, students, and community members to look for solutions.

Even if this iteration of the AI pilot doesn’t move forward in New Hanover County, a $2 million version is proceeding in Davidson County — and we could see an additional pilot elsewhere.

Longer term, I think it’s reasonable to think we’ll eventually see a push for statewide implementation. If Eviden’s technology costs a few million to install and several hundred thousand dollars a year to run for New Hanover or Davidson, then rolling it out to roughly 115 school districts is likely to mean hundreds of millions of dollars in implementation and millions more in annual operating costs. That’s a lot of money — public money — and deserves some serious consideration.
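For a rough sense of that scale, here is the back-of-the-envelope arithmetic. The per-district figures echo the numbers above; the rest is illustrative, not an official estimate.

```python
# Back-of-the-envelope arithmetic for a hypothetical statewide rollout.
# Per-district figures echo the pilot numbers discussed above; everything
# else is an illustrative assumption, not an official estimate.

districts = 115                     # roughly the number of NC school districts
install_per_district = 3_000_000    # "a few million to install" (assumed midpoint)
annual_per_district = 300_000       # "several hundred thousand dollars a year" (assumed)

statewide_install = districts * install_per_district
statewide_annual = districts * annual_per_district

print(f"Implementation: ~${statewide_install / 1e6:.0f} million")    # ~$345 million
print(f"Annual operations: ~${statewide_annual / 1e6:.1f} million")  # ~$34.5 million
```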

How did we get here?

Law enforcement outside of New Hanover High School on August 30, 2021. (Kelly Kenoyer / WHQR)

Based on my conversations with legislators and local officials, it seems the AI pilot has its origins in the violent incident at New Hanover High School in the summer of 2021, where a shooting sent one student to the hospital and another, ultimately, to prison. While violence in schools — and the broader downtown community — was not new, this incident touched a nerve, and kicked off a flurry of meetings, ultimately resulting in New Hanover County directing significant funding to address the problem.

According to several accounts I’ve heard, it also inspired Republican State Senator Michael Lee to begin looking for ways to address the problem. That led to 2023 legislation offering $5.2 million for a pilot program in New Hanover County — Lee’s district — and also Davidson County, represented by Republican State Senator Steve Jarvis. The desire appears to have been to find different-sized counties with different needs that could still be served by the same vendor. Under the bill, the vendor approved by whichever county signed a contract first must be used by both counties.

It appears that vendor was always intended to be Eviden, a tech company based in France. While the original bill allowed several companies to respond to RFPs in New Hanover and Davidson, updated language included in the Helene relief bill passed in late November last year narrowed the field (and also, importantly, kept the clock running on the pilot).

Eviden, as some critics have noted, doesn’t currently work in any U.S. K-12 schools, but it’s done plenty of work for airports, college campuses, and other businesses — as well as high-profile work for the Olympics in Paris and the World Cup in Qatar. (An acquaintance of mine working in digital security tech, who asked I not use their name, told me, “The Qataris bought the World Cup, you think they can’t buy high-end security?”)

The fact that Eviden is a foreign company has also rankled some people in both New Hanover and Davidson, who have voiced security concerns — but also economic arguments against sending millions of taxpayer dollars out of the country instead of keeping them closer to home by hiring a U.S.-based company (or by eschewing AI software entirely in favor of hiring more employees here in North Carolina).

Eviden has offered to address the data security concerns by hosting all information on a local server instead of using cloud-based storage. Whether that local system would — or could — be air-gapped to the satisfaction of all the critics I’ve talked to is unclear, but it appears Eviden is willing to negotiate with districts to address this kind of concern.

And there are domestic AI security companies, of course. One name I’ve heard referenced repeatedly is ZeroEyes, a Pennsylvania-based firm offering a system that, like Eviden’s, works with existing cameras. The company has contracts in 43 states, including many public schools — the result of aggressive lobbying.

“In several states where ZeroEyes has lobbyists registered, lawmakers have funded school safety purchases, sometimes issuing requirements so specific that ZeroEyes is the only legal competitor, despite there being other similar products on the market,” according to StateScoop, an online news outlet that tracks public sector technology issues.

Thanks to state contracts and federal support, ZeroEyes has been incredibly successful, posting 300% revenue growth for 2023. Eviden is obviously eager to take a piece of that market — and North Carolina seems a ripe location. ZeroEyes doesn’t appear to have any registered lobbyists in the state; Eviden registered two last spring and now has four.

ZeroEyes has received plenty of accolades, but it has also received criticism, as StateScoop noted, for over-promising on the capabilities of its technology and for generating false alarms. Like Eviden’s, the company’s software can detect weapons and other threats, but unlike Eviden’s, ZeroEyes’ results are reviewed by a human at ZeroEyes’ operations center, who then sends a notice to schools and law enforcement. Eviden’s software is more directly accessible to school administrators, which I’ve heard touted as a key benefit — and likely one of the key reasons ZeroEyes wasn’t selected.

Earlier this year, Davidson approved Eviden, which meant New Hanover County could still tweak the details of its own contract — choosing, if it wanted, different software features — but was locked into either working with Eviden or walking away entirely.

Admins approved. The public? Less so.

NHCS Town Hall on March 18, 2025. Yours truly is sitting on the right. (NHCS / WHQR)

The program has faced considerable pushback in New Hanover County, and for a variety of reasons. One of the most vocal opponents of the program, early on, was Republican school board member David Perry, who addressed his concerns at several public meetings and, earlier this month, penned an opinion piece in The Wilmington Conservative outlining his criticisms. Democratic board members Judy Justice and Dr. Tim Merrick voiced their own criticisms, including doubts about Eviden’s efficacy and financial stability, and concerns about the political connections between Mia Budd — who works as Eviden’s public sector liaison — and her uncle by marriage, Republican U.S. Senator Ted Budd.

While the public reaction to the pilot, including at a town hall meeting earlier this month, was largely negative, it received a lot of support from administrators, principals, and Superintendent Dr. Chris Barnes. Several administrators told me that they felt the current system of cameras is purely reactive. The district has hundreds of cameras, and no realistic way to have human beings watch them all in real time. Most often, camera recordings are reviewed after there’s been an incident, and the cameras themselves play little role in preventing violence or other issues. Barnes and others said that a proactive AI-powered system would be a valuable tool, especially when it comes to violent incidents, where seconds can make a huge difference.

That argument, that any tool to improve student safety ought to be explored, was made by the three Republican board members who supported the pilot — Pete Wildeboer, Pat Bradford, and vice-chair Josie Barnhart.

That left Chairwoman Melissa Mason, a Republican, as the swing vote. According to several people, including board members, who spoke on background, there was considerable pressure on Mason to support the program.

While responses to the AI pilot don’t neatly map onto left-right dichotomies, it was my experience talking to people – at the town hall, online, and around town – that members of the more grassroots, less establishment parts of the GOP were more skeptical of (or hostile to) AI in general, and this program in particular. That segment of the conservative community played a big part in Mason’s 2022 school board election. At the same time, Mason recently ran afoul of the local GOP after appointing Merrick and Justice to school board committees. Mason maintained she’d done nothing wrong, and while a recent attempt to officially condemn her actions (which fell short of a full censure) didn’t go through, it’s clear there was tension between her and the local conservative establishment. In short, you could see there might be some conflicting forces at work on Mason.

On Tuesday, Mason seemed visibly upset as she described being torn between her desire to increase student safety and her concerns about Eviden. Specifically, Mason said that Davidson County, which also had questions about data security and other issues, had conditioned its acceptance of Eviden on the company’s agreement to participate in a third-party risk assessment. Mason said that the chair of the Davidson school board – Nick Jarvis, no relation to State Senator Jarvis — told her Eviden had pushed back on participating in the assessment. Mason called that “a massive red flag” and said Eviden was “not trustworthy.” She brushed off an attempt by Wildeboer to add New Hanover County’s own condition of a risk assessment and voted with Justice, Merrick, and Perry.

I’ve seen plenty of schadenfreude on the left, celebrating the discord on the right. After all, Lee is a powerful figure in the Republican-run state senate, and the inability to get this pilot program approved — which feels like it should have been a cakewalk — is the kind of thing that would spike the capitol press corps’ FUBAR meter.

But others have actually celebrated the AI pilot as the rare issue where party doesn’t dictate opinion. I certainly enjoyed the break from the predictable grind of political trench warfare but — at the risk of being a buzzkill — there’s still politics at work, and misinformation in play.

"Misinformation"

Mason’s statement was contested by Eviden’s VP of DataOps and Support Shawn Hall, who referred to “misinformation” in an email to Barnhart. Other Eviden representatives have also confirmed that the company has not made any attempt to back away from the third-party risk assessment requested by Davidson’s school board.

Davidson Superintendent Dr. Greggory Slate also said Eviden had not resisted any requests for a third-party review. Slate noted that attorneys had been going “back and forth,” but said the process was moving forward. School board chair Jarvis has not responded to repeated requests for comment, so it’s hard to confirm what he told Mason. But it appears, at best, it wasn’t completely accurate.

Based in part on that, Mason’s statements met with criticism from her fellow Republicans on the school board, including Barnhart. It also prompted a dressing down from conservative commentator Nick Craig, who dismissed her "red flag" statement as “gaslighting” on the Thursday installment of his online show.

I asked Mason if what Slate and Eviden had told me swayed her opinion, but she said no, adding that she had “other concerns,” and noting that it was her understanding that no contract for an assessment had been signed with Davidson. She didn’t offer other specific concerns.

Feature Creep

Others have offered plenty of specific concerns. One of the most consistent I’ve heard involves the state-required features the vendor must offer, which include:

  • Threatening Object Detection
  • Intruder Detection
  • Person Down Detection
  • Door Open Detection
  • Tag and Track
  • Facial Recognition
  • Forensic Face Search
  • License Plate Reader

Some features, like detecting weapons, people falling down, or doors left open, met with little objection (though some were skeptical of Eviden’s ability to deliver on them, which I think is a different issue). However, other features, especially facial recognition technology, provoked a lot of concern. Even supporters like Barnhart were quick to condition their approval of Eviden on facial recognition being disabled.

But even with that caveat, and after Eviden and Superintendent Dr. Chris Barnes said the feature would remain ‘turned off,’ the public wasn’t assuaged, based on comments at the town hall. Board members like David Perry also remained skeptical.

“Perhaps to avert the public’s fear about their children being tracked constantly by the AI System, Eviden has proposed that they will not implement the Facial Recognition capability of their system. While the system has the capability of creating a photographic, biometric, and historical record for every student, staff, and citizen that enters school property, we are being assured that this feature will remain off. I asked how Intruder Detection would work if the system didn’t know who is supposed to be in certain parts of the school if the system didn’t know who the staff and students that are supposed to there are, and what they look like? I didn’t get a satisfactory answer,” Perry wrote for The Wilmington Conservative. 

Others have expressed concern that the facial recognition system could be tempting to use — perhaps first in an emergency, and then more frequently for more minor issues, and ultimately just left on full time. Barnes himself acknowledged that, while he promised to be a responsible steward of the technology, it could be a slippery slope.

I think it’s worth pointing out that, while there are absolutely worthwhile concerns about responsible use of the AI technology, when it comes to this pilot a lot of that hinges on having — or lacking — faith in human beings, like the superintendent and principals.

And, of course, we’re talking about a pilot program — but the long-term goal is likely for a statewide rollout. Whether future state legislation will allow districts to maintain the discretion they have in this pilot program remains to be seen. I think that’s a legitimate concern, but I think — at the same time — you could argue if the General Assembly wanted to jam through an unpopular statewide mandate for AI features like facial recognition, a lot of representatives would be facing some heat from their constituents.

Panopticon

Presidio Modelo in Cuba. (Wikipedia / Friman)

A broader concern I’ve heard about the AI pilot has more to do with the idea of surveillance in general. Perry compared it to the dystopian film Minority Report – a reference to the grim possibility that authorities would ignore warnings about false positives — and others have complained about their children being “put under a microscope,” or “digitally tracked to the nth degree.” One mother compared her student’s school to “the panopticon” — a reference I haven’t heard since my grad school days.

If you, like me, needed a refresher: the panopticon was a prison imagined by Utilitarian philosopher Jeremy Bentham as a ring of detention cells facing a central tower, from which a concealed guard could easily see every prisoner. Prisoners couldn’t tell whether they were being watched at any given moment but, because they were so exposed, Bentham reasoned, they would regulate themselves. (Fun fact: Bentham’s actual skeleton, dressed in his original clothes and topped with a wax head, sits on display at University College London, known as an 'auto-icon'.)

An actual panopticon was built in Cuba, known as El Presidio Modelo. It is terrifying.

But more often these days, “panopticon” is shorthand for a surveillance state. It’s easy to understand why people dislike the idea so much, and their concerns about privacy — and the security of the data constantly being gathered about them — seem totally legitimate.

The problem is that we’re already heavily surveilled — and kids, in schools, more so than maybe anyone else (outside of an actual prison). There are already cameras in New Hanover County schools, of course. But there’s also already some AI functionality.

On Thursday, Superintendent Barnes sent out an email addressing the issue. Barnes acknowledged he had recently discovered that four camera systems were replaced in the 2022-2023 fiscal year with cameras from Verkada, a California-based company whose security systems include built-in AI features.

“I found that some of our newer cameras (Verkada) came embedded with software that has some features that, given the public's concerns, I want to provide transparency about how they work and what they do,” Barnes wrote.

Verkada’s Command Video Management Platform includes a ‘people analytics’ function, which Barnes said the district has never used, and an AI-powered search, which the district has used.

“This feature helps staff quickly review footage by allowing searches for general attributes. For example, if needed, staff can search for a ‘red backpack’ or ‘black car,’ and the system will analyze recorded footage to return relevant results. This AI-powered search has always been restricted to general terms, not specific information,” Barnes wrote, adding that the feature requires a human to operate it, and that a trained staff member is always involved.

Barnes noted that, “these features came already installed on the cameras when they were purchased in 2022 and 2023. They were not added separately or altered beyond their original design. The system functions only within its intended scope and never autonomously collects or processes data.”
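To make the “general attributes, not specific information” distinction concrete, here is a minimal sketch of what attribute-based search over recorded footage amounts to. The data and the function are my illustration; none of this reflects Verkada’s or Eviden’s actual implementation.

```python
# A minimal, hypothetical sketch of attribute-based video search.
# Each record stands in for the output of a detector run over recorded
# footage; the fields and values are invented for illustration.

detections = [
    {"time": "2025-03-18T10:02", "camera": "gym-2",  "label": "backpack", "color": "red"},
    {"time": "2025-03-18T10:05", "camera": "lot-1",  "label": "car",      "color": "black"},
    {"time": "2025-03-18T10:09", "camera": "hall-3", "label": "backpack", "color": "blue"},
]

def search(records, label, color):
    """Match general attributes only -- no names, faces, or identities."""
    return [r for r in records if r["label"] == label and r["color"] == color]

# A staff member searching for a "red backpack" gets back times and cameras
# to review, not a specific person.
print(search(detections, "backpack", "red"))
```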

Again, this is technology that is likely to bother some people. And not everyone will be satisfied with the argument that there is a human being — that is, an administrator or other staff member — standing between AI technology and any actual action, like confronting a student or taking any disciplinary action.

Some people just hate the idea of a panopticon and don’t trust AI – that’s their right. But, without putting words in Barnes’ mouth, it seems AI-powered surveillance is, to some extent, the present-day situation, not a dystopian future. The status quo shouldn’t be used as a defense, but I think it’s important to at least lay out what the district is doing right now when we’re debating what it should do, with the current AI pilot or other programs down the road.

No-Bid, Mia Budd, and the illusion of choice

Mia Budd, part of Eviden's team, during a presentation to New Hanover County school board members. (NHCS / WHQR)

Critics of the AI pilot have also pointed to Eviden’s political connections and the fact that its selection by Davidson (and thus New Hanover) seems to be a fait accompli.

Some of the earliest and most vociferous criticism came from several articles in the Cape Fear Beacon, an anonymous online site founded by Peter LaFond, a Wilmington businessman with a tech background who also helped run Mason’s campaign. These articles pointed out, among other things, that Mia Budd — who presented to the school board as part of Eviden’s team in January — is the niece by marriage of U.S. Senator Ted Budd.

When Merrick asked Eviden representatives about this during the town hall this month, the crowd responded audibly. When Eviden’s rep noted that Budd is the company’s senior director for ‘public sector’ cybersecurity — essentially a liaison to the government, although not technically a lobbyist — the crowd erupted, gasping and laughing in derision. Merrick quipped that the arrangement “didn’t pass the smell test.” Even some of the fairly right-wing conservatives in the audience applauded Merrick, whom they would otherwise decry as a “radical leftist.”

While there are definitely partisan flavors, I’ve heard serious concerns on the left and the right about how Eviden ended up being essentially the only company that met the guidelines laid out in the state statute authorizing the AI pilot. For another example, Davidson County Board of Commissioners Chairman Todd Yates, a Republican, was extremely critical of the process when the county school board and superintendent presented to the commissioners in January (you can find the full meeting video here).

“Are you okay with the state taking over your responsibility and your rights?” Yates asked the school board chair.

Some of the frustration, including in Davidson County, is that the statute is written in a way that suggests it’s open to multiple eligible bidders — and, in fact, both New Hanover and Davidson went through the process of putting out requests for proposals and vetted three vendors each, including Eviden.

Davidson’s school superintendent, Dr. Greggory Slate, said he felt the district had a free hand to choose, and argued that other companies could have altered their bid to meet the state’s requirements. That may be true, but it seems like those requirements were specifically written to cater to Eviden’s suite of options — a situation similar to many bids won by ZeroEyes in other states. On paper, the statute avoids a no-bid contract situation — which would require an emergency, a national security concern, or other exemption – but you can understand why some have criticized it.

There’s plenty of smoke here, but it’s not clear if Mia Budd actually played a role in convincing Lee and other legislators that Eviden is the best — and, in fact, only — company for the job. Certainly, her political connections would have helped. But it’s also plausible that Lee thought Eviden was the best company for the job based on their international reputation and went from there.

Again, at some point, it’s likely the General Assembly will want to explore moving from a pilot to broader implementation of the AI security program. At that point, we won’t be talking about $5.2 million (not to say that’s nothing), but hundreds of millions of dollars, and I imagine there will be more scrutiny of the deal — and more questions about which vendors have the opportunity to compete. Companies like Actuate, Rhombus, Avigilon, and — yes — ZeroEyes could all be eyeing a piece of that pie; Eviden, for its part, would presumably want to defend and expand its foothold in Davidson.

It’s also worth noting that, even assuming that Eviden is the best company for the job right now, AI is a rapidly evolving industry. The playing field could look very different in a year.

Why AI?

Matthew McConaughey in an ad for the Agentforce AI personal assistant. (YouTube)

If you’ve been reading this newsletter, you know I’m fairly skeptical of AI. I’m not talking about general AI — that is, software that basically produces a conscious mind — but more limited machine-learning software like ChatGPT and its image- and video-based relatives. There are some fun things you can do with AI, but there’s also a lot of hype, and many products that feel like they’re foisting a use-case that doesn’t really exist onto the general public.

Take, for example, the Agentforce AI-powered personal assistant from Salesforce, which was rolled out with a high-profile advertising campaign featuring Matthew McConaughey. One ad features a disgruntled McConaughey, dining al fresco in the rain as a waiter brings him a plate of soggy shrimp cocktail. McConaughey laments that, if only he had Agentforce — and not some inferior AI-powered product — then his digital personal assistant would have known to ‘move his reservation inside’ due to the weather, and wouldn’t have ordered him the shrimp, which is apparently not his thing.

Yes, the implications are really that dumb. Salesforce would have you believe that, without their product, restaurants would seat you in the rain and bring you food you don’t like and didn’t order. 

Over the last five years, there’s been a kind of Tulip Mania for AI. I’ve seen plenty of problems where AI is pitched as a magical fix, but far fewer cases where machine learning is actually useful. It doesn’t often seem to save time or money, and it can produce results plagued by glitches and hallucinations (that is, situations where software confidently provides answers or results that are incorrect or nonsensical).

But that said, there are some things that AI is very good at, mostly involving pattern recognition. Programming software to recognize the geometric patterns of a firearm, the gait of a person walking versus running, or the airflow characteristics of smoke is well within the wheelhouse of AI, and scaling that software to monitor dozens, or hundreds, of security camera feeds is a matter of processing power.
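As a rough illustration of why that scaling is mostly a processing-power problem, here is a minimal sketch of that kind of monitoring loop. The camera-reading code uses OpenCV; detect_threats() is a hypothetical stand-in for whatever trained model a vendor actually ships, and the feed URLs are invented.

```python
# A minimal sketch of real-time monitoring across many camera feeds.
# OpenCV reads the streams; detect_threats() is a hypothetical placeholder
# for a vendor's trained detection model (weapons, smoke, surging crowds).

import cv2

CAMERA_URLS = ["rtsp://camera-01/stream", "rtsp://camera-02/stream"]  # hypothetical feeds

def detect_threats(frame):
    """Placeholder: a real system would run a trained model here and
    return (label, confidence) pairs, e.g. [("weapon", 0.93)]."""
    return []

def monitor(urls, alert_threshold=0.9):
    captures = [(url, cv2.VideoCapture(url)) for url in urls]
    while True:
        for url, cap in captures:
            ok, frame = cap.read()
            if not ok:
                continue  # dropped frame or dead feed; a real system would reconnect
            for label, confidence in detect_threats(frame):
                if confidence >= alert_threshold:
                    # In the pilot as described, this is where an alert would be
                    # pushed to a principal or administrator for a human decision.
                    print(f"ALERT [{url}]: {label} ({confidence:.0%})")

# monitor(CAMERA_URLS)  # each additional feed costs compute, not staff attention
```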

Staffing human employees to monitor dozens of camera feeds at dozens of schools and facilities would take a lot of people who, it’s worth pointing out, would spend a lot of time staring at screens waiting for something to happen.

It remains to be seen if Eviden’s AI software will be useful — presumably, something a pilot would help the district assess. But if it’s effective, it would be worth debating the annual operating cost – around $300,000, according to Eviden — versus the staffing that money would otherwise provide: perhaps five or six SROs (school resource officers), or a few additional assistant principals.
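For what that tradeoff looks like on paper, here is the arithmetic; the per-position figure is my assumption for illustration, not a district number.

```python
# Rough comparison of the cited annual operating cost to equivalent staffing.
# The per-position cost is an assumption for illustration, not a district figure.

annual_operating_cost = 300_000    # Eviden's cited annual cost
cost_per_position = 55_000         # assumed salary and benefits for one SRO

print(f"~{annual_operating_cost / cost_per_position:.1f} positions")  # roughly five or six
```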

I think there’s a compelling argument that AI is a more efficient way to monitor school security.

But, at the same time, I understand critics — on the right and left — who are frustrated by politicians’ sudden enthusiasm for AI in schools, and by their willingness to direct taxpayer money to a foreign company over investing in local school employees.

The reason we're all here

There’s at least one more thing that I think should play a role in the debate over the AI pilot, and that’s the level of violence occurring in our schools right now.

Supporters of the program have repeatedly referenced the New Hanover High School shooting from four years ago. Even if you’re vehemently opposed to the pilot, it’s hard to argue against wanting to prevent another incident like that.

But if you talk to principals, teachers, social workers, parents, and students, you’ll hear that there is a lot more violence happening under the radar than we’re discussing. We occasionally get press releases from the sheriff’s office or the school district when a student is caught on campus with a gun, but there are many other incidents involving other weapons, or just bare fists, that we’re not notified about. Or, at least not officially.

There are good reasons to keep many of these situations confidential, because they involve minors. But I’ve heard criticisms, from liberal-progressive education advocates and conservative Moms for Liberty members alike, that the public is not getting an accurate view of just how bad things have gotten in the schools. However different their political spin or prescribed solutions might be, the diagnosis remains the same.

I’ve heard some suggestions that the board could consider releasing more information, redacting certain details to protect student privacy while still painting a more accurate picture for people. As a journalist, I’d obviously support more transparency, but I think the general public would appreciate it, too — even if it’s unsettling.

That doesn’t mean anyone should automatically change their mind and embrace the AI pilot, or any other future program, simply because 'things are bad.' I’m not suggesting a post-9/11 PATRIOT Act mentality here. If AI isn’t what the public (and, by extension, the board) wants, then I think a lot of people would like that to prompt a discussion of what would work.

Hopefully, whatever the board, or the General Assembly, decides to do, the conversation here in New Hanover County and around the state can stay reasonable, well-informed, and clear about what’s at stake. Hundreds of millions of taxpayer dollars — and over a million public school students — depend on it.

Editor's note: This piece originally referred to the New Hanover County GOP's attempted rebuke of Melissa Mason's actions as a "censure." The technical motion put forward was a "Resolution of Condemnation," which failed. This article has been corrected to accurately reflect that.

Ben Schachtman is a journalist and editor with a focus on local government accountability. He began reporting for Port City Daily in the Wilmington area in 2016 and took over as managing editor there in 2018. He’s a graduate of Rutgers College and later received his MA from NYU and his PhD from SUNY-Stony Brook, both in English Literature. He loves spending time with his wife and playing rock'n'roll very loudly. You can reach him at BSchachtman@whqr.org and find him on Twitter @Ben_Schachtman.