In brief
The case against a man accused of murder has been thrown out by a judge after prosecutors withdrew disputed evidence of an AI-identified gunshot sound.
Michael Williams, 65, who denied any wrongdoing, sat in jail for 11 months awaiting trial for allegedly killing Safarian Herring, 25.
It’s said that in May last year, Williams was driving through Chicago one night hoping to buy some cigarettes. Herring waved him down for a ride, and Williams, recognizing the younger man from the neighborhood, let him into his car. Soon after, another vehicle pulled up alongside, and someone in a passenger seat took out a gun and shot Herring in the head, Williams told police. Herring’s mother said her son, an aspiring chef, had been shot at two weeks earlier at a bus stop.
Herring, who was taken to hospital by Williams, died from the gunshot wound, and Williams ended up being charged with his murder. A key piece of evidence against him came from ShotSpotter, a company that operates microphones spread across US cities including Chicago that, with the aid of machine-learning algorithms, detect and identify gunshot sounds to immediately alert the cops.
Prosecutors said ShotSpotter picked up a gunshot sound where Williams was seen on surveillance camera footage in his car, putting it all forward as proof that Williams shot Herring right there and then. Police did not cite a motive, had no eyewitnesses, and did not find the gun used in the attack. Williams did have a criminal history, though, having served time for attempted murder, robbery, and discharging a firearm when he was younger, and said he had turned his life around significantly since. He was grilled by detectives, and booked.
Crucially, Williams’ lawyers – public defenders Lisa Boughton and Brendan Max – said records showed that ShotSpotter actually initially picked up what sounded like a firework a mile away, and this was later reclassified by ShotSpotter staff to be a gunshot at the intersection where and when Williams was seen on camera. ShotSpotter strongly insisted it had not improperly altered any data to favor the police’s case, and said that regardless of the initial real-time alert, its evidence of the gunshot was the result of follow-up forensic analysis, which was submitted to the courts.
After Williams’ lawyers asked the judge in the case to carry out an inquiry, the prosecution last month withdrew the ShotSpotter report, and asked for the case to be dismissed on the basis of insufficient evidence, which the judge agreed to. Williams is now a free man again.
“I kept trying to figure out, how can they get away with using the technology like that against me,” Williams told the Associated Press for an in-depth investigation into the case published this week. “That’s not fair.”
The internet used our AI to make NSFW images!
Startup Kapwing, which built a web application that uses computer-vision algorithms to generate pictures for people, is disappointed netizens used the code to produce NSFW material.
The software employs a combination of VQGAN and CLIP – made by researchers at the University of Heidelberg and OpenAI, respectively – to turn text prompts into images. This approach was popularised by artist Katherine Crowson in a Google Colab notebook; there’s a Twitter account dedicated to showing off this type of computer art.
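At a high level, these systems repeatedly nudge an image generator's latent codes so that CLIP scores the decoded image as a better match for the text prompt. The toy sketch below illustrates that optimisation loop only; the `score` function and the 16-dimensional "latent" are stand-ins invented for this example (real pipelines use a VQGAN decoder and CLIP's text/image encoders, which are not used here), so it runs without any ML dependencies.

```python
import numpy as np

rng = np.random.default_rng(0)

# Stand-in for a CLIP text embedding of the user's prompt.
target = rng.normal(size=16)

def score(latent):
    # Stand-in for CLIP similarity: higher when the "image" latent
    # is closer to the prompt's embedding. Real systems decode the
    # latent to pixels with VQGAN and embed the image with CLIP.
    return -np.sum((latent - target) ** 2)

# Stand-in for VQGAN latent codes, initialised randomly.
latent = rng.normal(size=16)

lr = 0.1
for _ in range(200):
    grad = -2.0 * (latent - target)  # gradient of score w.r.t. latent
    latent += lr * grad              # gradient ascent on the similarity
```

After a few hundred steps the latent converges on whatever the scorer rewards, which is why arbitrary prompts (artistic or otherwise) yield *some* image, however abstract.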
Kapwing had hoped its implementation of VQGAN and CLIP on the web would be used to make art from users’ requests; instead, we’re told, it was used to make filth.
“Since I work at Kapwing, an online video editor, making an AI art and video generator seemed like a project that would be right up our alley,” Eric Lu, co-founder and CTO at Kapwing, said.
“The problem? When we made it possible for anyone to generate art with artificial intelligence, barely anyone used it to make actual art. Instead, our AI model was forced to make videos for random inputs, trolling queries, and NSFW intents.”
Submitted prompts ranged from “naked woman” to the downright bizarre “thong bikini covered in chocolate” or “gay unicorn at a funeral.” The funny thing is, the images made by the AI aren’t even particularly realistic or sexually explicit. Below is an example output for “naked woman.”
“Is it that the internet just craves NSFW content so much that they will type it anywhere? Or do people have a propensity to try to abuse AI systems?” Lu continued. “Either way, the content outputted must have [been] disappointing to these users, as most of the representations outputted by our models were abstract.”
Intel ‘winds down’ RealSense biz
Intel is shuttering its RealSense computer-vision product wing. The business unit’s chips, cameras, LiDAR, hardware modules, and software were aimed at things like digital signage, 3D scanning, robotics, and facial-authentication systems.
Now the plug’s been pulled, and RealSense boss Sagi Ben Moshe is departing Intel after a decade at the semiconductor goliath.
“We are winding down our RealSense business and transitioning our computer vision talent, technology and products to focus on advancing innovative technologies that better support our core businesses and IDM 2.0 strategy,” an Intel spokesperson told CRN.
All RealSense products will be discontinued, though it appears its stereo cameras for depth perception will stay, to some degree, according to IEEE Spectrum. ®