WELCOME to the future, where vending machines aren’t just for drinks and snacks – but rounds of bullets for pistols and rifles.
American Rounds, a company founded in the States, has built a “deeply dystopian” kiosk that uses AI technology to age-check customers buying ammunition.
They are both real and legal, according to city officials in Tuscaloosa, Alabama, cited by local outlet Thread.
The automated dispensaries have been vetted by the Bureau of Alcohol, Tobacco, Firearms and Explosives (ATF).
However, there are still doubts over their legality.
As Newsweek notes, at least one of the devices has been taken down amid a Tuscaloosa city council investigation into its legality.
Despite the investigation, the machines are now being placed in a growing number of supermarkets across Alabama, Oklahoma and Texas.
“We have over 200 store requests for AARM [Automated Ammo Retail Machine] units covering approximately nine states currently,” CEO Grant Magers told Newsweek.
“And that number is growing daily.”
In a promotional video, American Rounds CEO Grant Magers claimed the new way of selling ammunition is safer due to its built-in AI technology.
“Traditionally, ammunition is sold at outdoor-type stores, your sporting goods stores, and it just sits on a shelf and it’s very accessible and because of that, there’s a high rate of theft,” Magers said.
“With our machines, we have a very secure automated retail machine able to age-verify.
“We scan a driver’s license and take 360 facial recognition for the purchase and match it to the ID.”
All it takes to buy a round of bullets from these machines is to click on the type of ammunition you want, let the machine scan your ID and make your payment.
It’s similar to ordering at a self-checkout kiosk in McDonald’s – except you get bullets instead of a Happy Meal, and your ID can’t stay in your wallet.
According to American Rounds’ website, the machines are accessible “24/7”, letting gun owners “buy ammunition on your own schedule, free from the constraints of store hours and long lines.”
However, as Futurism points out, human employees retain the right to deny gun and ammunition purchases to individuals for any reason other than discrimination against protected classes.
A human store clerk can see whether a potential buyer appears unstable or unwell, in a way that facial technology currently can’t.
Despite being widely deployed, facial recognition technology can still be unreliable at identifying the faces of women and racial minorities.
The company also does not have a privacy and security pledge or document on its website.
The vague privacy terms have sparked concern among potential customers, although CEO Magers has previously said that the company isn’t selling facial recognition data.
In response to the new machines, Tuscaloosa Mayor Walt Maddox joked: “It’s what the founding fathers intended!”
In a comment under the promotional video on YouTube, one onlooker wrote: “People in other countries must be like WTF??? when they see this. LOL. Welcome to America.”
Others have found it less funny, describing it as “deeply dystopian”.
A third person wrote: “Scanning the drivers license and facial recognition is a huge NO from me. I’ll pass.”
The Sun has contacted American Rounds for comment.
Artificial intelligence is a highly contested issue, and it seems everyone has a stance on it. Here are some common arguments against it:
Loss of jobs – Some industry experts argue that AI will create new niches in the job market, and as some roles are eliminated, others will appear. However, many artists and writers insist the issue is an ethical one, as generative AI tools are being trained on their work and wouldn’t function otherwise.
Ethics – When AI is trained on a dataset, much of the content is taken from the Internet. This is almost always, if not exclusively, done without notifying the people whose work is being taken.
Privacy – Content from personal social media accounts may be fed to language models to train them. Concerns have cropped up as Meta unveils its AI assistants across platforms like Facebook and Instagram. There have been legal challenges to this: in 2016, legislation was created to protect personal data in the EU, and similar laws are in the works in the United States.
Misinformation – As AI tools pull information from the Internet, they may take things out of context or suffer hallucinations that produce nonsensical answers. Tools like Copilot on Bing and Google’s generative AI in search are always at risk of getting things wrong. Some critics argue this could have lethal effects – such as AI dispensing the wrong health advice.