The Weekly Weird #75
Robo-crap, Auntie lies, Gilligan's my-land, collaborative combat, post-cards, algorithmic pricing, unBearable, show me the Monet, the new Ring thing
Welcome back to your Weekly Weird, the regular round-up of irregularities that pass for reality these days!
Here’s one from the vault to kick us off. The first two minutes of the following 1970 interview with Orson Welles are that incomparable raconteur’s description of hiking with a Nazi in the Tyrol pre-war and ending up sitting next to Hitler at a meal.
“He had no personality whatsoever. He was invisible.”
Robo-crap
1X have started pre-sales of their new NEO robot, advertised as an autonomous butler for early adopters and, according to their homepage, Boomers who need even more free time to play an unrecognisable game of cards.
NEO will run you a cool $20,000 up front to own or a $499 monthly subscription, with delivery scheduled for an unspecified date in 2026.
Joanna Stern of the Wall Street Journal investigated:
Stern discovered that it took NEO over a minute to walk ten feet to fetch a bottle of water from a fridge, and almost five minutes to put two glasses and a fork into a dishwasher. But the real shocker comes when she is told that the robot is actually being piloted remotely by a guy named Turing Zelsnack (yes, really).
It isn’t a robot at all. Everything it did in the video above was via remote control.
Its creators say NEO will eventually run on an onboard neural network, but to begin with, even after it has been sold as an in-home robot, it will be controlled by a company employee to train it in real-world scenarios.
As 1X CEO Bernt Børnich explained to Stern:
“In 2026, if you buy this product, it is because you are okay with that social contract. If we don’t have your data, we can’t make the product better.”
To be clear, the social contract to which he is referring is having a stranger in control of a 66-pound robot that is loose in your home, with a video and audio feed that is being recorded to train 1X’s AI.
1X describes it on their website as “Expert Mode”:
For any chore it doesn’t know, you can schedule a 1X Expert to guide it, helping NEO learn while getting the job done.
Ronny Chieng on the Daily Show was pretty blunt: “It’s not a robot, it’s some guy!”
George Carlin put it best when he said way back in 1999 that “America’s most profitable business is still the manufacture, packaging, distribution, and marketing of bullshit.”
As if the bullshit, overall creepiness, and potential for cyberstalking were insufficiently off-putting, Børnich dropped the following in his interview with Stern:
“I’m a big fan of what I call, like, the Big Brother/Big Sister principle, right? Big Sister helps you, Big Brother is just there to kind of monitor you, and we are very much the Big Sister.”
Oh. That’s different.
Auntie Lies
The British Broadcasting Corporation (BBC), known affectionately as Auntie, has taken a pounding this week because of the Prescott Report, a document prepared by Michael Prescott and made public by The Telegraph in which, driven to “despair at inaction by the BBC Executive when issues come to light”, he compiled a catalogue of “profound and unresolved concerns about the BBC”.
Among the allegations are that:
BBC’s Panorama programme “spliced together two clips from separate parts of [Trump’s] speech” on January 6, 2021 to create “the impression that Trump said something he did not”, which “materially misled viewers” because he “did not explicitly exhort supporters to go down and fight at Capitol Hill”
“It had taken six months for the BBC to take decisive action about a story that was not fit for purpose and spread damaging misinformation”, specifically that insurance companies were guilty of ethnic discrimination in their premiums
The BBC Push Notifications system was “ignoring immigration issues”, specifically that, in September 2023, “of 219 notifications, just four were about the issues of illegal migrants and asylum seekers [and] [o]f those, three centred on the poor conditions or mistreatment of migrants” despite “[t]hat month [having] seen the highest number of illegal migrants crossing the channel in a single day – a fact covered by both PA News and BBC Quickfire but was not on the BBC PN alerts.”
Multiple complainants claimed that “time and time again the LGBTQ desk staffers would decline to cover any story raising difficult questions about the trans-debate”, “that the desk had been captured by a small group of people promoting the Stonewall view of the debate and keeping other perspectives off-air”, and “that stories raising difficult questions about the ‘trans agenda’ were ignored even if they had been widely taken up and discussed across other media outlets.”
“As an institution the BBC too often views issues of gender and sexuality as a celebration of British diversity rather than exploring the complexities of the subject”, highlighting “a cultural problem across the BBC – that too many of its staff have never considered the idea of ‘gender identity’ to be either spurious or offensive to many people.”
“[T]here is no sign of an open admission by the executive about systemic problems within BBC Arabic”, even after “[m]edia stories about the antisemitic and pro-Hamas views of journalists appearing on BBC Arabic”.
In “an internal report presented to the EGSC on May 14th, 2024, the Committee was again warned of problems with the BBC’s coverage of Israel’s war with Hamas”, including that “[c]laims against Israel seem to be raced to air or online without adequate checks, evidencing either carelessness or a desire always to believe the worse about Israel. The errors come thick and fast, sometimes with “eyewitness” testimony from locals who have Tweeted in praise of the October 7 killings and worse.”
“There are clearly worrying systemic issues with the BBC’s coverage in the areas set out above,” Prescott wrote in his conclusion. “From what I witnessed, I fear the problems could be even more widespread than this summary might suggest.”
The Prescott Report and the subsequent public reaction have already led to the resignations of BBC Director-General Tim Davie and the CEO of BBC News, Deborah Turness.
NB: For those outside the UK, it’s worth pointing out that the BBC is publicly funded through the ‘licence fee’, a tax (nearly £180 for 2025/6) levied on every household in Britain that owns a television or streams content. Non-payment of the licence fee can result in a heavy fine or even prosecution, which makes BBC malfeasance, misbehaviour, and denial a sore spot in the public square.
BBC Chair Samir Shah, in two letters to the House of Commons, described the “sacred job of the BBC” as being to inform the public “in a way that is impartial, truthful and is based on evidence they can trust”. Despite repeatedly contesting the evidence and arguments put forward by Prescott, and adopting an overall ‘mistakes were made’ defensive tone, Shah affirmed in the letter that he will “personally ensure that the BBC continues to take the necessary actions in the future”.
In the Guardian’s thread on why the BBC is called Auntie, Dave Bush from Leamington Spa chimed in with his proposed answer:
It was originally a put down, inferring that the BBC did not listen to criti[ci]sm, advice or requests from people other than themselves. “Auntie knows best, Dear!”.
Indeed.
Gilligan’s My-land
Vince Gilligan, creator of Breaking Bad and Better Call Saul, is back with a new dystopian show on Apple TV called Pluribus.
I haven’t watched it yet, but the trailer looks interesting and his Breaking Bad credentials buy him a heck of a lot of rope.
Collaborative Combat
Andúril, “the Flame of the West” in Tolkien’s story universe, is also an American defence company co-founded by Oculus inventor Palmer Luckey (pictured below demonstrating how to play Rock Paper Scissors alone).
Anduril have built an unmanned jet named Fury (or the less-snappy YFQ-44A) that just had its maiden flight.
Fury is “semi-autonomous” and “operates by itself, carrying out mission plans and adjusting its flight without human input”. It is billed as “a collaborative combat aircraft or CCA, operated by artificial intelligence”, with a kill switch that allows a human to stop it.
Anduril assured 60 Minutes that a human being is required in the kill chain to approve lethal action. All of this is another step along the road towards LAWS (lethal autonomous weapons systems) becoming the standard in future warfare.
Anduril are also hiring, with a very 21st century recruitment video to go with their ‘future of warfare’ vibe.
Post-Cards
Mastercard have foretold the end of the “physical form factor” in payments.
Despite having “card” in his employer’s name, Gautam Aggarwal, division president for South Asia at Mastercard, was candid about the future. Speaking at TechSparks 2025, YourStory’s startup tech summit, Aggarwal said, “The card is not the form factor anymore.”
“Existing physical form factors that we know of today — be it the mobile device, be it a card, be it a QR code — I don’t think those will exist.”
One could be forgiven for assuming that we’re talking about biometric payment methods like palm, fingerprint, iris, or voice, but it is much creepier than that.
The company is investing in “agentic commerce AI” technology that enables transactions to happen without any conscious user interaction. This could mean RFID tags and AI systems detecting what customers are carrying and automatically charging them as they walk out of the store.
While the concept of automated payment on exit isn’t new, the mainstreaming of the idea that transactions should be permitted to happen “without any conscious user interaction” is seriously troubling.
There is something foundational about the cultural assumption that a transaction is signified by you parting with your money, by choice, in a way that is clear and recognisable to you. To toss that foundation aside in the search for ever-more “frictionless” payment is, at least in my opinion, more than a logistical or economic shift. It’s philosophical.
In a time so preoccupied by consent and power in personal interactions, it seems glaringly asymmetrical that a firm in charge of the financial infrastructure of our day-to-day lives would float an alternative that deprives the purchaser in a transaction so completely of both.
Speaking of technology changing how we pay for things in a way that is potentially damaging to the fabric of society…
Algorithmic Pricing
To the uninitiated, algorithmic pricing is the automated adjustment of the cost of a product or service. This isn’t entirely new or bad, such as when airlines raise seat prices closer to a departure (“time” adjustment), but when “surveillance pricing” permits costs to be adjusted from person to person based on detailed information about each individual (“personal” pricing), it gets more complicated and more unpleasant.
The idea that a company will charge more if it can, or that people are willing to pay more for something under certain circumstances, is not exactly earth-shattering. However, when your browser data, location, income, search history, or other personal markers can be accessed and parsed by companies to design a pricing profile that only applies to you, so that you are paying a different price to everyone else, this becomes less about economic factors and more about control, power, and the social coherence of commerce.
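To make the mechanics concrete, here is a minimal sketch in Python of what “personal” pricing amounts to. Every signal name and weight below is a hypothetical illustration, not a description of any real company’s system:

```python
# Minimal sketch of "surveillance pricing": the same item gets a
# different price per shopper, based on a profile built from personal
# signals. All signal names and weights are hypothetical illustrations.

BASE_PRICE = 5.00  # the price a profile-free shopper would see

def personal_price(profile: dict) -> float:
    """Adjust the base price using (hypothetical) personal markers."""
    multiplier = 1.0
    if profile.get("device") == "new_flagship_phone":
        multiplier += 0.20  # pricier device, assumed wealthier shopper
    if profile.get("searched_item_recently"):
        multiplier += 0.15  # revealed intent: assumed less price-sensitive
    if profile.get("zip_income_percentile", 50) > 80:
        multiplier += 0.10  # high-income neighbourhood
    return round(BASE_PRICE * multiplier, 2)

# Two customers, same espresso, different prices:
alice = {"device": "new_flagship_phone",
         "searched_item_recently": True,
         "zip_income_percentile": 90}
bob = {}  # no profile data available

print(personal_price(alice))  # 7.25
print(personal_price(bob))    # 5.0
```

The point of the sketch is that nothing about the product changes; only the inferred profile of the buyer does, which is precisely what separates “personal” pricing from ordinary time-based adjustment.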
As of November 10, New York’s new Algorithmic Pricing Disclosure Act requires “most companies that use algorithmic pricing to clearly display a disclosure notifying consumers that prices are set using their personal data”, and New Yorkers are being encouraged to “file a complaint with the Office of the Attorney General (OAG)” if they feel such pricing has not been properly disclosed using the prescribed formulation “THIS PRICE WAS SET BY AN ALGORITHM USING YOUR PERSONAL DATA”.
A $1,000 penalty per violation awaits non-compliant businesses.
The jury is still out as to whether surveillance pricing is even economically useful in real terms, let alone socially desirable.
As Professor Oren Bar-Gill of Harvard University explains in this short lecture on the social effects of algorithmic price discrimination, the “willingness to pay” in economic models can be made up of “preferences” (neoclassical) or “preferences + (mis)conceptions” (behavioural). In the former case, “the welfare effects of price discrimination are ambiguous”, while in the latter case “discrimination hurts consumers even more and may even reduce efficiency”.
Imagine being in a coffee shop and hearing that the person in front of you was charged $5 for an espresso, and then getting to the counter and being asked to pay $7. Would you pay extra, argue, or leave? How would you feel? Not just towards the business, but towards your fellow customer who paid less?
UnBearable
While we’re on the subject of coffee shops and people paying too much for things, let’s have a quick shudder together over the Starbucks ‘Bearista’ mania that resulted in a sadly predictable combination of consumer desire, TikTok triumphalism, hand-wringing think-pieces, and actual police-were-called violence. Over a $30 cup shaped like a bear.
Show Me The Monet
Xania Monet’s track How Was I Supposed To Know? debuted at #30 on Billboard’s Adult R&B Airplay chart, an event unremarkable except for the fact that Xania Monet does not exist and the song was generated by “Mississippi-based songwriter Telisha ‘Nikki’ Jones”, who used Suno to set her lyrics to AI-generated music.
Billboard have estimated that “five of her songs had generated $52,000 in just over two months”, and “a bidding war” to sign Monet saw “offers reaching $3 million” before a deal was concluded with Hallwood Media.
While Monet is “the first known AI artist to earn enough radio airplay to debut on a Billboard radio chart”, “at least one AI artist has debuted in each of the past six chart weeks, a streak suggesting this trend is quickly accelerating.”
Sure enough, by the time I sat down to write this, America had its first AI #1 song, Walk My Walk by Breaking Rust, which “hit No. 1 on Billboard’s Country Digital Song Sales chart, reaching over 3 million streams on Spotify in less than a month.”
Rick Beato summed it up perfectly on his YouTube channel: “Oh geez.”
The New Ring Thing
Amazon is bringing facial recognition to its Ring doorbell cameras. The tool, which is expected to roll out this winter, would allow Ring camera owners to tag and identify specific people who come into view of their cameras.
While Amazon says the feature is intended to help residents quickly recognize friends, family members, and frequent visitors, privacy advocates argue it represents a significant expansion of biometric surveillance into neighborhoods, sidewalks, and front doors.
Why does anyone need help recognising their own “friends, family members, and frequent visitors”? Who is this actually for?
EFF elaborates on the problems with the new facial recognition tool:
When turned on, the feature will scan the faces of all people who approach the camera to try and find a match with a list of pre-saved faces. This will include many people who have not consented to a face scan, including friends and family, political canvassers, postal workers, delivery drivers, children selling cookies, or maybe even some people passing on the sidewalk.
Massachusetts Senator Ed Markey called out Amazon for the intrusion:
“Although Amazon stated that Ring doorbell owners must opt in to activate the new facial recognition feature, that safeguard does not extend to individuals who are unknowingly captured on video by a Ring doorbell camera. These individuals never receive notice, let alone the opportunity to opt in or out of having their face scanned and logged in a database using FRT. To put it plainly, Amazon’s system forces non-consenting bystanders into a biometric database without their knowledge or consent. This is an unacceptable privacy violation.”
EFF explains why this is a serious issue for civil liberties:
Today’s feature to recognize your friend at your front door can easily be repurposed tomorrow for mass surveillance. Ring’s close partnership with police amplifies that threat. For example, in a city dense with face recognition cameras, the entirety of a person’s movements could be tracked with the click of a button, or all people could be identified at a particular location. A recent and unrelated private-public partnership in New Orleans unfortunately shows that mass surveillance through face recognition is not some far flung concern.
This comes on the heels of recent reporting from Biometric Update that Amazon Web Services (AWS) is “becoming a key supplier and broker for police, using its massive infrastructure and network of partners to place AI at the core of modern surveillance.”
AWS has positioned itself not just as a host of police data but also as a promoter and intermediary for surveillance tools. Internal emails from West Coast law enforcement agencies show that AWS’s “law enforcement and school safety” team has been pushing a portfolio of surveillance technologies directly to police departments.
These include license-plate tracking from Flock Safety; analytics and data fusion tools from Lucidus Solutions (now part of Flock) and C3 AI; weapons detection from ZeroEyes; bodycam-based reporting from Abel Police and Mark43; video search and face tracking from Veritone; and voice analytics of inmate calls via Leo Technologies’ Verus. Together, these tools target a police technology market that could exceed $11 billion.
But Amazon Ring’s facial recognition on your doorstep will be strictly limited to “eliminating guesswork and making it effortless to find and review important moments involving specific familiar people”.
That’s it for this week’s Weird, everyone. I hope you enjoyed it.
Outro music is Fake News BBC, an AI parody song released today in response to the ongoing scandal. Make sure you stay tuned for Jimmy Savile’s saxophone solo at the end, and watch out for a range of (AI-generated) celebrity cameos.
One hundred and eighty quid for this rot
Stuffed by the Beeb, now we’re tied in a knot
Stay sane, friends.